
List of accepted submissions

 
 
  • Open access
  • 63 Reads
Colloquium on Gauge Transformations for Thermodynamic Fluxes and Thermal Diffusion
We discuss molecular diffusion transport in infinitely dilute liquid solutions under non-isothermal conditions. This discussion is motivated by a recurring misinterpretation of thermodynamic transport equations written in terms of the chemical potential in the presence of a temperature gradient. The transport equations contain contributions arising from a gauge transformation related to the fact that the chemical potential is determined only up to a summand of the form (AT + B) with arbitrary constants A and B, where constant A stems from the invariance of entropy with respect to shifts by a constant value and B from the invariance of the potential energy with respect to shifts by a constant value. The coefficients of the cross-effect terms in the thermodynamic fluxes receive contributions from this gauge transformation and, in general, are not the actual cross-effect physical transport coefficients. Our treatment is based on consideration of the entropy balance and suggests a promising route for evaluating the thermal diffusion constant from first principles. We also discuss the impossibility of "barodiffusion" for dilute solutions, understood in the sense of a diffusion flux driven by the pressure gradient itself. When one speaks of "barodiffusion" terms in the literature, these terms typically represent drift in an external potential force field (e.g., electric or gravitational), where, in the final equations, the specific force on the molecules is substituted with an expression involving the hydrostatic pressure gradient this external force field produces. Obviously, interpreting the latter as "barodiffusion" is fragile and may hinder proper accounting for the diffusion fluxes produced by the pressure gradient itself.
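The gauge freedom described in the abstract can be made explicit in a schematic flux relation (the notation below is ours, not the authors'): writing the diffusion flux as

\[
J = -\alpha\,\nabla\mu - \beta\,\nabla T , \qquad \mu \;\longrightarrow\; \mu' = \mu + AT + B ,
\]

the same flux expressed through the gauged potential becomes

\[
J = -\alpha\,\nabla\mu' - \left(\beta - \alpha A\right)\nabla T ,
\]

so the coefficient multiplying \(\nabla T\) shifts with the arbitrary constant A and cannot by itself be read off as the physical thermal-diffusion coefficient; only gauge-invariant combinations carry physical meaning.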
  • Open access
  • 88 Reads
The Entropy Conundrum: A Solution Proposal
In 2004, University of Michigan physicist Mark Newman, along with biologist Michael Lachmann and computer scientist Cristopher Moore, showed that if electromagnetic radiation is used as a transmission medium, the most information-efficient format for a given 1-D signal is indistinguishable from blackbody radiation. Since many natural processes maximize the Gibbs-Boltzmann entropy, they should give rise to spectra indistinguishable from optimally efficient transmission. In 2008, computer scientist C.S. Calude and physicist K. Svozil proved that "Quantum Randomness" (QR) is not Turing computable. While "true randomness" is a mathematical impossibility, certification by value indefiniteness ensures that quantum random bits are incomputable in the strongest sense. Algorithmic random sequences are incomputable, but the converse implication is false. In 2013, Politecnico di Milano academic scientist R.A. Fiorini confirmed Newman, Lachmann and Moore's result, creating an analogous example for a 2-D signal (an image) as an application of CICT to pattern recognition and image analysis. Paradoxically, if you do not know the code used for the message, you cannot tell the difference between an information-rich message and a random jumble of letters. This is the entropic conundrum to solve. Since its conception, the scientific community has placed itself in a double-bind situation. Even the most sophisticated instrumentation system is completely unable to reliably discriminate so-called "random noise" (RN) from any combinatorially optimized encoded message, which Fiorini called "deterministic noise" (DN). The fundamental concept of entropy crosses many scientific and research areas but, unfortunately, even across so many different disciplines, scientists have not yet worked out a definitive solution to the fundamental problem of the logical relationship between human experience and knowledge extraction. Therefore, both the classic concept of entropy and system RN should be revisited deeply at the theoretical and operational levels. A convenient solution proposal will be presented.
  • Open access
  • 99 Reads
Thermodynamic Study of the Unsteady BGK Model for a Binary Gas Mixture Affected by a Nonlinear Thermal Radiation Field
In the present study, a thermodynamic study extending the papers [JNET, 2011, 36(1), 75-98 and Can. J. Phys., 2012, 90(2), 137-149] is introduced. The non-stationary BGK (Bhatnagar-Gross-Krook) model of the kinetic equations for a rarefied gas mixture affected by a nonlinear thermal radiation field is solved instead of the stationary equations. The unsteady solution gives the problem greater generality and a wider range of applications. The non-equilibrium thermodynamic properties of the system (gas mixture + heated plate) are investigated. The entropy, entropy flux, entropy production, thermodynamic forces, and kinetic coefficients are obtained for the system. The Boltzmann H-theorem, the Le Chatelier principle, the second law of thermodynamics, and Onsager's reciprocity relations are verified for the system. The ratios between the different contributions to the internal energy changes, based upon the total derivatives of the extensive parameters, are estimated via the Gibbs formula. The results are applied to an Argon-Neon binary gas mixture for various values of both the molar fraction parameters and the radiation field intensity. 3D graphics illustrating the calculated variables are drawn to predict their behavior, and the results are discussed.
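For reference, the Gibbs formula invoked above relates the change in internal energy to the total derivatives of the extensive parameters; in its standard form for a mixture (our rendering, not taken from the paper),

\[
dU = T\,dS - p\,dV + \sum_i \mu_i\, dn_i ,
\]

so the ratios discussed in the abstract compare the thermal (T dS), mechanical (-p dV) and chemical (\(\mu_i dn_i\)) contributions to dU.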
  • Open access
  • 109 Reads
Performance Optimization of Three-Heat-Source Irreversible Refrigerators Based on the NSGA-II Algorithm
In the present research, an optimization investigation of an irreversible absorption refrigeration system is carried out on the basis of a new thermo-ecological criterion. The objective functions considered are the specific entropy generation rate and the ecological coefficient of performance (ECOP). These two objective functions are optimized simultaneously using the multi-objective optimization algorithm NSGA-II: the ECOP is maximized while the specific entropy generation rate is minimized in order to obtain the best performance. Decision making is carried out by means of two methods, LINMAP and TOPSIS. Finally, sensitivity and error analyses are performed for the system.
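As an illustration of the decision-making step, below is a minimal TOPSIS sketch for selecting one design from a Pareto front of (ECOP, specific entropy generation rate) pairs; the sample points and the equal weights are assumptions for illustration, not values from the paper.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    decision_matrix : (n_alternatives, n_criteria) array
    weights         : criterion weights summing to 1
    benefit         : True for criteria to maximize, False for criteria to minimize
    """
    M = np.asarray(decision_matrix, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    V = weights * M / np.linalg.norm(M, axis=0)
    # Ideal and anti-ideal points depend on whether a criterion is a benefit or a cost.
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    return d_minus / (d_plus + d_minus)  # closeness coefficient; higher is better

# Hypothetical Pareto-front points: columns are (ECOP, specific entropy generation rate).
front = np.array([[0.40, 0.012],
                  [0.35, 0.009],
                  [0.30, 0.007]])
scores = topsis(front, weights=np.array([0.5, 0.5]), benefit=np.array([True, False]))
print("preferred design index:", np.argmax(scores))
```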
  • Open access
  • 77 Reads
Bayesian Estimation of the Entropy of the Half-Logistic Distribution Based on Type-II Censored Samples
This paper estimates the entropy of the half-logistic distribution with a scale parameter, based on Type-II censored samples. The maximum likelihood estimator and an approximate confidence interval are derived for the entropy. For Bayesian inference, a hierarchical Bayesian estimation method is developed using the hierarchical structure of the gamma prior distribution, which induces a noninformative prior. The random-walk Metropolis algorithm is employed to generate Markov chain Monte Carlo samples from the posterior distribution of the entropy. The proposed estimation methods are compared through Monte Carlo simulations for various Type-II censoring schemes. Finally, real data are analyzed for illustration purposes.
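A minimal sketch of the sampling step described above: random-walk Metropolis for the scale parameter under Type-II censoring, with the entropy computed numerically from each posterior draw. The simulated data and the vague 1/sigma prior are assumptions for illustration; the paper uses a hierarchical gamma prior.

```python
import numpy as np

rng = np.random.default_rng(0)

def half_logistic_logpdf(x, sigma):
    # f(x) = 2 exp(-x/sigma) / (sigma (1 + exp(-x/sigma))^2), x > 0
    z = x / sigma
    return np.log(2.0) - np.log(sigma) - z - 2.0 * np.log1p(np.exp(-z))

def half_logistic_logsf(x, sigma):
    # 1 - F(x) = 2 exp(-x/sigma) / (1 + exp(-x/sigma))
    z = x / sigma
    return np.log(2.0) - z - np.log1p(np.exp(-z))

def log_posterior(sigma, obs, n):
    """Type-II censored log-likelihood (r smallest of n observed) plus an assumed vague prior."""
    if sigma <= 0:
        return -np.inf
    r = len(obs)
    loglik = half_logistic_logpdf(obs, sigma).sum() + (n - r) * half_logistic_logsf(obs.max(), sigma)
    return loglik - np.log(sigma)  # p(sigma) proportional to 1/sigma, not the paper's hierarchical gamma prior

def entropy_numeric(sigma, grid=np.linspace(1e-6, 200.0, 200001)):
    # Differential entropy -integral of f log f, evaluated by the trapezoidal rule.
    logf = half_logistic_logpdf(grid, sigma)
    f = np.exp(logf)
    return -np.trapz(f * logf, grid)

# Hypothetical censored data: the r = 15 smallest of n = 20 simulated observations.
n, r, true_sigma = 20, 15, 2.0
obs = np.sort(true_sigma * np.abs(rng.logistic(size=n)))[:r]

# Random-walk Metropolis on log(sigma); the log-Jacobian enters the acceptance ratio.
draws, log_sigma = [], 0.0
for _ in range(5000):
    prop = log_sigma + 0.2 * rng.standard_normal()
    log_ratio = (log_posterior(np.exp(prop), obs, n) + prop) - (log_posterior(np.exp(log_sigma), obs, n) + log_sigma)
    if np.log(rng.uniform()) < log_ratio:
        log_sigma = prop
    draws.append(np.exp(log_sigma))

post_sigma = np.array(draws[1000:])
post_entropy = np.array([entropy_numeric(s) for s in post_sigma[::50]])
print("posterior mean entropy estimate:", post_entropy.mean())
```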
  • Open access
  • 66 Reads
Detection and Classification of Anomalies in Network Traffic Using Generalized Entropies and OC-SVM with Mahalanobis Kernel
Network anomaly detection and classification is an important open issue in network security. Several approaches and systems based on different mathematical tools have been studied and developed. Among them is the Anomaly-based Network Intrusion Detection System (A-NIDS), which monitors network traffic and compares it against an established baseline of "normal" traffic profiles. It is therefore necessary to characterize "normal" Internet traffic. This paper presents an approach for anomaly detection and classification based on: the entropy of selected features (including Shannon, Renyi and Tsallis entropies); the construction of regions from entropy data employing the Mahalanobis distance (MD); and a One-Class Support Vector Machine (OC-SVM) with different kernels (RBF and Mahalanobis) for normal and abnormal traffic. Regular and non-regular regions built from "normal" traffic profiles allow anomaly detection, whilst classification is performed under the assumption that regions corresponding to the attack classes have been characterized previously. Although this approach allows the use of as many features as required, only four well-known significant features were selected in our case. To evaluate our approach, two different data sets were used: one set of real traffic obtained from an academic LAN, and the other a subset of the 1998 MIT-DARPA set. The feature sets selected in our experiments provide detection rates of up to 99.98% for "normal" traffic and up to 99.05% for anomalous traffic, with a false alarm rate of 0.019%. Experimental results show that certain values of the q parameter of the generalized entropies and the use of OC-SVM improve the detection rate for some attack classes, owing to a better fit of the region to the data. Moreover, our results show that MD yields high detection rates with an efficient computation time, while OC-SVM achieves slightly more precise detection rates but is computationally more expensive.
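The pipeline sketched in the abstract (entropy features per traffic window, then a one-class SVM region for "normal" traffic plus a Mahalanobis-distance test) can be outlined as follows; the window features, synthetic data, and parameter values are assumptions, and scikit-learn's OneClassSVM with an RBF kernel stands in for the paper's kernels.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def renyi(p, q):
    return np.log2(np.sum(p[p > 0] ** q)) / (1.0 - q)

def tsallis(p, q):
    return (1.0 - np.sum(p[p > 0] ** q)) / (q - 1.0)

def hist_prob(values, bins=32):
    hist, _ = np.histogram(values, bins=bins)
    return hist / hist.sum()

def window_features(ports, sizes, q=2.0):
    """Four entropy features for one traffic window (the fields are illustrative)."""
    p_ports, p_sizes = hist_prob(ports), hist_prob(sizes)
    return np.array([shannon(p_ports), renyi(p_ports, q),
                     shannon(p_sizes), tsallis(p_sizes, q)])

rng = np.random.default_rng(1)
def synth_window(rate):  # stand-in for a real capture window
    return rng.poisson(rate, size=500), rng.poisson(3 * rate, size=500)

X_normal = np.vstack([window_features(*synth_window(20)) for _ in range(200)])

# One-class SVM with RBF kernel learns the region occupied by "normal" windows.
oc_svm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X_normal)

# Mahalanobis distance to the normal centroid gives a cheaper region test
# (pinv guards against a nearly singular covariance of correlated entropy features).
mu = X_normal.mean(axis=0)
prec = np.linalg.pinv(np.cov(X_normal, rowvar=False))
mahalanobis = lambda x: float(np.sqrt((x - mu) @ prec @ (x - mu)))

X_test = np.vstack([window_features(*synth_window(60)) for _ in range(20)])
print("OC-SVM labels (+1 normal, -1 anomaly):", oc_svm.predict(X_test))
print("max Mahalanobis distance:", max(mahalanobis(x) for x in X_test))
```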
  • Open access
  • 97 Reads
Size Distribution of Portuguese Firms between 2006 and 2012
This study aims to describe the size distribution of Portuguese firms, as measured by annual sales and total assets, between 2006 and 2012, giving an economic interpretation for the time evolution of the distribution. Three distributions are fitted to the data: the lognormal, the Zipf and the Simplified Canonical Law (SCL). Methods of estimation include Maximum Likelihood, modified Ordinary Least Squares in log-log scale, and Nonlinear Least Squares using the Levenberg-Marquardt algorithm. The lognormal and Zipf distributions can be justified from Gibrat's law and, in the Zipf case, a constraint must be added, for instance, on the minimum size of firms. As for the SCL, first presented by Mandelbrot in the context of linguistics, the argument used to deduce the distribution of word frequencies in texts is adapted to the distribution of firms. This is done by defining production units which can be glued together to constitute firms, so that the network of firms can be used flexibly to satisfy the most diverse needs of the economy in the least costly way while minimizing the delay in achieving it. For this distribution, different interpretations of the estimated parameters found in the literature are confronted and discussed in the light of the data. Using this interpretation, we attempt to characterize the rhythm of Portuguese economic activity, the diversity of activity sectors, and competition. Diversity is captured by an entropy measure at different levels of aggregation, and not just at the highest level as is usually done in the literature. The interrelations between these economic characteristics and others, such as the level of concentration of the firm size distribution, are assembled from the literature and confronted with the data. Analyzing entropy at different levels of aggregation allows us to verify how the evolution of the activity rhythm can have a distinct impact on diversity at those different levels.
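A minimal sketch of two of the three fits mentioned above, the lognormal by maximum likelihood and the Zipf exponent by OLS on the log-log rank-size plot, plus a sector-level entropy diversity measure; the data and sector labels are synthetic placeholders, not the Portuguese firm data.

```python
import numpy as np

rng = np.random.default_rng(2)
sales = rng.lognormal(mean=12.0, sigma=1.8, size=5000)  # synthetic "annual sales", not real data

# Lognormal fit by maximum likelihood: the MLEs are the mean and std of log-sales.
log_s = np.log(sales)
mu_hat, sigma_hat = log_s.mean(), log_s.std(ddof=0)

# Zipf (rank-size) fit by OLS in log-log scale: log(rank) ~ c - alpha * log(size).
sizes = np.sort(sales)[::-1]
ranks = np.arange(1, len(sizes) + 1)
slope, c = np.polyfit(np.log(sizes), np.log(ranks), deg=1)
alpha = -slope

# Sector-diversity entropy at one aggregation level (sector labels are illustrative).
sectors = rng.integers(0, 20, size=len(sales))
shares = np.bincount(sectors, weights=sales)
shares = shares / shares.sum()
diversity = -np.sum(shares[shares > 0] * np.log(shares[shares > 0]))

print(f"lognormal: mu={mu_hat:.2f}, sigma={sigma_hat:.2f}; Zipf exponent={alpha:.2f}; sector entropy={diversity:.2f}")
```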
  • Open access
  • 82 Reads
Mechanical Generation of Networks with Surplus Complexity
In previous work I examined an information-based complexity measure of networks with weighted links. The measure was compared with that obtained by randomly shuffling the original network, forming an Erdős-Rényi random network preserving the original link weight distribution. It was found that real-world networks almost invariably had higher complexity than their shuffled counterparts, whereas networks mechanically generated via preferential attachment did not. The same experiment was performed on foodwebs generated by an artificial life system, Tierra, and a couple of evolutionary ecology systems, EcoLab and WebWorld. These latter systems often exhibited the same complexity excess shown by real-world networks, suggesting that the complexity surplus indicates the presence of evolutionary dynamics. In this paper, I report on a mechanical network generation system that does produce this complexity surplus. The heart of the idea is to construct the network of state transitions of a chaotic dynamical system, such as the Lorenz equation. This indicates that complexity surplus is a more fundamental trait than that of being an evolutionary system.
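A minimal sketch of the construction described above, under our own choices of discretization (fixed-step Euler integration of the Lorenz equations and coarse binning of the attractor into boxes, with box-to-box transition counts as link weights); this is not the author's implementation.

```python
import numpy as np
from collections import Counter

def lorenz_trajectory(n_steps=100000, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz equations with a simple fixed-step Euler scheme."""
    xyz = np.empty((n_steps, 3))
    x, y, z = 1.0, 1.0, 1.0
    for i in range(n_steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xyz[i] = (x, y, z)
    return xyz

def transition_network(xyz, n_bins=12):
    """Coarse-grain the state space into boxes and count box-to-box transitions.

    Returns a dict mapping (source_box, target_box) -> link weight.
    """
    lo, hi = xyz.min(axis=0), xyz.max(axis=0)
    boxes = np.floor((xyz - lo) / (hi - lo + 1e-12) * n_bins).astype(int)
    states = [tuple(b) for b in boxes]
    links = Counter((s, t) for s, t in zip(states[:-1], states[1:]) if s != t)
    return dict(links)

net = transition_network(lorenz_trajectory())
nodes = {n for edge in net for n in edge}
print(f"{len(nodes)} nodes, {len(net)} weighted links")
```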
  • Open access
  • 73 Reads
A Quantitative Theory of Cognition with Applications
An abstract non-probabilistic model of "situations" from the "world", with a focus on the interplay between "Nature" and "Observer", is presented and some applications, probabilistic and non-probabilistic, are discussed. Nature is the holder of "truth". Observer seeks the truth but is restricted to considerations of, especially, "belief", "action" and "control". Inference is based on these elements. To simplify, our model identifies action and control and derives these concepts from belief - recall that "belief is a tendency to act" (Good 1952). "Knowledge" is mainly thought of as "perception", the way situations from the world are presented to Observer. "Interaction" connects truth and belief with knowledge. If interaction leads to undisrupted truth, you are in the "classical world". Adding probabilistic elements, you are led to elements of Shannon theory. If mixtures of truth and belief represent the rules of the world, you are led to Tsallis entropy instead. The quantitative basis for the theory is the view that "knowledge is obtained at a cost" and the associated modelling by a "proper effort function" (inspired by the use in statistics of proper scoring rules). Specific situations depend on "preparations". Certain preparations, defined by reference to the effort function, are suggested to represent what can be known, the "knowable". Other philosophically inclined considerations serve to provide natural interpretations. This includes the introduction of game-theoretical thinking in the interplay between Nature and Observer. "Entropy", "redundancy" and the related notion of "divergence" make sense in the abstract theory. Other notions from Information Theory also appear, e.g. the "Pythagorean (in)equalities" as known from information-theoretical inference. But applications are not only to information theory. For example, the (in)equalities just pointed to also lead to the classical geometric results bearing Pythagoras' name. Among applications, we point to notions of Bregman divergence and elaborations which are presently being considered in relation to classical duality theory.
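For orientation, the divergence and Pythagorean relations alluded to can be stated in the familiar Bregman form (standard definitions, not specific to this abstract's framework): for a differentiable convex function $f$,

\[
D_f(x, y) = f(x) - f(y) - \langle \nabla f(y),\, x - y \rangle ,
\]

and if $y^*$ is the Bregman projection of $z$ onto a convex set $C$ and $x \in C$, the Pythagorean inequality

\[
D_f(x, z) \;\geq\; D_f(x, y^*) + D_f(y^*, z)
\]

holds, with equality when $C$ is affine. Taking $f(x) = \|x\|^2$ gives $D_f(x, y) = \|x - y\|^2$ and recovers the classical Pythagorean theorem, while $f(p) = \sum_i p_i \log p_i$ on the probability simplex gives the Kullback-Leibler divergence used in information-theoretical inference.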
  • Open access
  • 57 Reads
Combined Use of Information Entropy and Bepipred Scores for Screening Ebola Virus Glycoprotein (GP) Sequences
Unprecedented epidemics caused by strains of Ebola virus (EBOV) are associated with extremely high mortality rates. Moreover, an anti-EBOV vaccine is not yet available. Because EBOV glycoprotein (GP) is the precursor of the proteins mediating the binding and internalization of the virus, EBOV GP is an attractive target for vaccine development. It is reported here that, by means of the combined use of information entropy (H) and Bepipred predictive epitope screening of GP amino acid sequences, four invariant peptide sequences were identified in the GP of Zaire (ZEBOV) and Sudan (SEBOV), the two most prominent strains of EBOV. These four peptide sequences were subsequently used for a direct GP search in a strain of EBOV with a GP sequence length different from that of the combined GP (ZEBOV, SEBOV) sequence set. It is concluded (1) that the combined H/Bepipred screening procedure identified invariant peptides that may be of value as components of potential anti-EBOV vaccines, and (2) that the identified amino acid sequences enabled a direct sequence search despite differences in GP sequence length that precluded computation of H.
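A minimal sketch of the entropy half of the screening described above: per-column Shannon entropy over aligned sequences, with fully conserved (H = 0) stretches retained when an epitope score is high. The toy alignment, the mocked epitope scores, and the cutoff are illustrative assumptions; real Bepipred output would replace the mocked scores.

```python
import numpy as np

def positional_entropy(alignment):
    """Shannon entropy (bits) at each column of an aligned set of equal-length sequences."""
    length = len(alignment[0])
    H = np.zeros(length)
    for i in range(length):
        residues = [seq[i] for seq in alignment]
        counts = np.array([residues.count(a) for a in set(residues)], dtype=float)
        p = counts / counts.sum()
        H[i] = -np.sum(p * np.log2(p))
    return H

def invariant_windows(H, epitope_scores, min_len=4, epitope_cutoff=0.35):
    """Stretches where H == 0 (fully conserved) and the epitope-like score is high.

    `epitope_scores` stands in for real Bepipred output; the cutoff is an assumption.
    """
    keep = (H == 0) & (epitope_scores >= epitope_cutoff)
    windows, start = [], None
    for i, flag in enumerate(list(keep) + [False]):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_len:
                windows.append((start, i))
            start = None
    return windows

# Toy aligned GP fragments (not real EBOV sequences) and mocked epitope scores.
alignment = ["MTTSLQKRVA", "MTTSLQARVA", "MTTSLQKRVA"]
H = positional_entropy(alignment)
scores = np.full(len(H), 0.5)
print("entropy per column:", np.round(H, 2))
print("candidate invariant windows:", invariant_windows(H, scores))
```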