The orbital precessions of the Keplerian orbital elements induced by the 1pN gravitomagnetic spin octupole moment of a rigidly rotating oblate spheroid are calculated in their full generality for an arbitrary orientation of the primary’s spin axis and a general orbital configuration of the test particle.
Proteins are vital biological macromolecules that execute biological functions and form the core of synthetic biological systems. The history of de novo protein design has evolved from initial successes with simpler structural designs to the creation of more intricate proteins that approach the complexity of natural ones. Recent strides in protein design have leveraged computational methods to craft proteins with functions beyond their natural capabilities. Molecular dynamics (MD) simulations have emerged as a crucial tool for understanding the structural and dynamic properties of de novo-designed proteins. In this study, we examine the pivotal role of MD simulations in elucidating the sampling methods, force fields, water models, stability, and dynamics of de novo-designed proteins, highlighting their potential applications in diverse fields. The synergy between computational modeling and experimental validation continues to play a crucial role in the creation of novel proteins tailored for specific functions and applications.
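To make the kind of workflow surveyed here concrete, the sketch below sets up an explicit-solvent MD run in OpenMM; the input file name (design.pdb), the Amber14 protein force field, the TIP3P-FB water model, and all run parameters are illustrative assumptions, not the protocols of the studies reviewed.

```python
# Minimal MD setup sketch (assumptions: input "design.pdb", Amber14 force field,
# TIP3P-FB water, 300 K Langevin dynamics); illustrative only.
from openmm.app import PDBFile, ForceField, Modeller, Simulation, PME, HBonds
from openmm import LangevinMiddleIntegrator
from openmm.unit import kelvin, picosecond, picoseconds, nanometer

pdb = PDBFile("design.pdb")                                # de novo-designed structure
ff = ForceField("amber14-all.xml", "amber14/tip3pfb.xml")  # protein force field + water model

modeller = Modeller(pdb.topology, pdb.positions)
modeller.addHydrogens(ff)                                  # designed models often lack hydrogens
modeller.addSolvent(ff, padding=1.0 * nanometer)           # explicit water box around the protein

system = ff.createSystem(modeller.topology, nonbondedMethod=PME,
                         nonbondedCutoff=1.0 * nanometer, constraints=HBonds)
integrator = LangevinMiddleIntegrator(300 * kelvin, 1 / picosecond, 0.002 * picoseconds)

sim = Simulation(modeller.topology, system, integrator)
sim.context.setPositions(modeller.positions)
sim.minimizeEnergy()                                       # relieve clashes before dynamics
sim.step(50_000)                                           # short 100 ps stability check
```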
The search for extraterrestrial intelligence (SETI) represents a well-known area of astrobiology. This chapter is dedicated to technosignatures, that is, markers produced by extraterrestrial intelligences (ETIs). The famous Drake equation for roughly estimating the number of communicative ETIs is introduced, its various factors are defined, and some of its shortcomings and implications for detecting technosignatures are discussed. Next, the Fermi paradox is delineated: if ETIs are widespread, where are they? Three major classes of solutions to this classic paradox (e.g., we are effectively alone) are considered, along with their accompanying ramifications. After a brief segue into the Kardashev scale for grouping ETIs, the final segment of the chapter categorises the diverse landscape of technosignatures – ranging from artificial radio and optical signals to atmospheric pollutants and waste heat arising from energy harvesting and dissipation – and outlines the current limits derived for the frequency of technosignatures, as well as the anticipated future constraints in this context.
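For reference, the Drake equation in its commonly quoted form (standard notation, assumed here) reads $N = R_{*}\,f_p\,n_e\,f_l\,f_i\,f_c\,L$, where $R_{*}$ is the Galactic star formation rate, $f_p$ the fraction of stars hosting planets, $n_e$ the mean number of potentially habitable planets per such system, $f_l$, $f_i$ and $f_c$ the fractions of those on which life, intelligence and communicative technology respectively arise, and $L$ the average lifetime of the communicative phase.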
The impact of the 1pN gravitoelectric mass monopole acceleration, both for a test particle and for a two-body system of finite, comparable masses, is calculated for different types of observation-related quantities (Keplerian orbital elements; anomalistic, draconitic, and sidereal orbital periods; two-body range and range rate; radial velocity curve and radial velocity semiamplitude of spectroscopic binaries; astrometric angles RA and dec.; times of arrival of binary pulsars; characteristic timescales of transiting exoplanets). The results are applied to a test particle orbiting a primary, to a Sun–Jupiter exoplanet system, and to an S star orbiting Sgr A*.
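For orientation, in the test-particle limit and standard coordinates the 1pN gravitoelectric acceleration entering such calculations takes the well-known form $\boldsymbol{A}_{\mathrm{1pN}} = \dfrac{\mu}{c^{2} r^{2}}\left[\left(\dfrac{4\mu}{r}-v^{2}\right)\hat{\boldsymbol{r}} + 4\dot{r}\,\boldsymbol{v}\right]$, with $\mu = GM$ the primary's gravitational parameter, $r$ and $v$ the particle's distance and speed, and $\dot r$ its radial velocity; the finite-mass two-body case adds mass-ratio-dependent terms. This standard expression is quoted here for context and is not taken from the abstract itself.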
In this chapter we develop the contextual approach to quantum mechanics. This approach matches the views of Bohr, who emphasized that the quantum description represents complexes of experimental physical conditions (in modern terminology, experimental contexts). In this chapter we formalize the contextual approach on the basis of contextual probability theory, which is closely connected with generalized probability theory (but not interpretationally identical to it). Contextual probability theory serves as the basis of the contextual measurement model (CMM), which covers measurements in classical, quantum, and quasi-classical physics.
This chapter is a step towards understanding why quantum nonlocality is a misleading concept. Metaphorically speaking, quantum nonlocality is Janus-faced. One face is the apparent nonlocality of the state update based on the Lüders projection postulate; it can be referred to as intrinsic quantum nonlocality. The other face is subquantum nonlocality: by introducing a special model with hidden variables, one derives the Bell inequality and claims that its violation implies the existence of mysterious instantaneous influences between distant physical systems (Bell nonlocality). According to the Lüders projection postulate, a quantum measurement performed on one of two distant entangled physical systems, say on S1, instantaneously modifies the state of S2. Therefore, if the quantum state is considered an attribute of an individual physical system (the Copenhagen interpretation), and if one assumes that experimental outcomes are produced in a random way, one arrives at a contradiction. This is a primary source of speculation about a spooky action at a distance. But Einstein already pointed out that the quantum paradoxes disappear if one adopts the statistical interpretation.
This work aims to perform a parametric study on a round supersonic jet with a design Mach number $M_d = 1.8$, which is manipulated using a single steady radial minijet with a view to enhancing its mixing. Four control parameters are examined, i.e. the mass flow rate ratio $C_m$ and diameter ratio $d/D$ of the minijet to main jet, and the exit pressure ratio $P_e/P_a$ and fully expanded jet Mach number $M_j$, where $P_e$ and $P_a$ are the nozzle exit and atmospheric pressures, respectively. Extensive pressure and schlieren flow visualization measurements are conducted on the natural and manipulated jets. The supersonic jet core length $L_c/D$ exhibits a strong dependence on the four control parameters. Careful scaling analysis of experimental data reveals that $L_c/D = f_1(C_m, d/D, P_e/P_a, M_j)$ may be reduced to $L_c/D = f_2(\xi)$, where $f_1$ and $f_2$ are different functions. The scaling factor $\xi = J({d_i}/{D_j})/(\gamma M_j^2{P_e}/{P_a})$ is physically the penetration depth of the minijet into the main jet, where $J({d_i}/{D_j})$ is the square root of the momentum ratio of the minijet to main jet ($d_i$ and $D_j$ are the fully expanded diameters corresponding to $d$ and $D$, respectively), $\gamma$ is the specific heat ratio and $\gamma M_j^2{P_e}/{P_a}$ is the non-dimensional exit pressure ratio. Important physical insight may be gained from this scaling law into the optimal choice of control parameters such as $d/D$ and $P_e/P_a$ for practical applications. It has been found for the first time that the minijet may induce a street of quasi-periodical coherent structures once $C_m$ exceeds a certain level for a given ${P_e}/{P_a}$. Its predominant dimensionless frequency $St_e$ ($\equiv f_e D_j/U_j$) scales with a factor $\zeta = J({d_i}/{D_j})\sqrt{\gamma M_j^2{P_e}/{P_a}}$, which is physically the ratio of the minijet momentum thrust to the ambient pressure thrust. The formation mechanism of the street and its role in enhancing jet mixing are also discussed.
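As a small sketch of the arithmetic behind the two scaling factors quoted above (the function $J$ is taken as given, since only its definition as the square root of the momentum ratio is stated in the abstract), the snippet below evaluates $\xi$ and $\zeta$ from the control parameters:

```python
import math

def scaling_factors(J, gamma, M_j, Pe_over_Pa):
    """Evaluate the two scaling factors quoted in the abstract.

    J           : J(d_i/D_j), square root of the minijet-to-main-jet momentum ratio
    gamma       : specific heat ratio of the gas
    M_j         : fully expanded jet Mach number
    Pe_over_Pa  : nozzle exit to atmospheric pressure ratio, P_e/P_a
    """
    dyn = gamma * M_j**2 * Pe_over_Pa   # non-dimensional exit pressure ratio
    xi = J / dyn                        # penetration depth: L_c/D = f2(xi)
    zeta = J * math.sqrt(dyn)           # momentum-to-pressure-thrust ratio: St_e scales with zeta
    return xi, zeta
```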
The manifold requirements for a world to sustain habitability on long timescales (continuous habitability) are delineated in this chapter. The first part offers a brief introduction to climate physics (e.g., greenhouse effect), and thereupon formulates the notion of the habitable zone, that is, the region where liquid water could exist on rocky planets orbiting stars; the boundaries of the habitable zone as a function of the stellar temperature are also presented. In the second part, the various stellar factors potentially involved in regulating planetary habitability are sketched: winds, flares and space weather, and electromagnetic radiation. The third part chronicles some planetary variables that may affect habitability: mass, plate tectonics, magnetic field, tidal locking, and atmospheric composition. The last part is devoted to examining the high-energy astrophysical processes that might impact habitability on galactic scales: candidates in this regard include supernovae, gamma-ray bursts, and active supermassive black holes.
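A commonly used distillation of the habitable zone concept (quoted here for context, not taken from the chapter) expresses the boundary distance as $d = \sqrt{(L_{*}/L_{\odot})/S_{\mathrm{eff}}}\ \mathrm{au}$, where $L_{*}$ is the stellar luminosity and $S_{\mathrm{eff}}$ is the effective stellar flux at the inner or outer edge, itself a function of the stellar effective temperature.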
We consider the initial ‘slumping phase’ of a lock-release gravity current (GC) on a down slope, with a focus on particle-driven (turbidity) flows in the inertia–buoyancy (large Reynolds number) and Boussinesq regime. We use a two-layer shallow-water (SW) model for the depth-averaged variables, and compare the predictions with previously published experimental data. In particular, we analyse the empirical conclusion of Gadal et al. (J. Fluid Mech., vol. 974, 2023, A4) that the slumping displays a constant speed for a significant range of slopes and particle-sedimentation speeds. We emphasize the physical definition of the slumping phase (stage): the adjustment process during which (a) the fluid in the lock is set into motion by the dam break, and then (b) forms a tail from the backwall to the nose. We focus on the question of whether and when the propagation speed $u_N$ of the nose (front) of the GC is constant during this process (there is consensus that a significant deceleration of $u_N$ appears in the post-slumping stage). The SW theory correctly predicts the adjustment of the flow field during the slumping stage, but indicates that a constant $u_N$ appears only in the classical case ($\gamma =E=c_D=\beta =0$), where $\gamma, E, c_D, \beta$ are the slope, the entrainment and drag coefficients, and the scaled particle settling speed for a particle-driven GC. However, since $\gamma, E, c_D, \beta$ are typically small, the change of $u_N$ during the slumping phase is also small in many cases of interest. The interaction between the various driving and hindering mechanisms is elucidated. We show that, in a system with a horizontal (open) top (typical of laboratory experiments), the height of the ambient layer increases along the slope, and this compensates for the buoyancy loss due to particle sedimentation. We point out the need for further experimental and simulation studies for a better understanding of the slumping phase and of the transition to the subsequent phases, and for further assessment and improvement of the SW predictions.
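For readers unfamiliar with such models, two-layer SW formulations of this kind are typically closed with a nose (front) jump condition of the form $u_N = Fr\,\sqrt{g' h_N}$, where $g'$ is the reduced gravity, $h_N$ the height of the current at the nose, and $Fr$ a Froude function (e.g., of Benjamin type); the precise closure used in this study is not specified in the abstract, so this is quoted only as the standard form.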
The hidden variable project realized by Bell contradicts the uncertainty and complementarity principles. The inequalities derived with Bell’s hidden variables are violated for quantum observables. Thus, Bell’s hidden variables should be rejected and the validity of quantum theory is confirmed. (This foundational achievement deserved the Nobel Prize in 2022.) This scientific loop, from ignorance of the uncertainty and complementarity principles to a hidden variables model, to Bell’s inequalities, to their violation, and finally to re-establishing the validity of the uncertainty and complementarity principles, was stimulating for quantum foundations. However, Bohr and Heisenberg might say that such results could be expected from the very beginning. For them, the uncertainty and complementarity principles form the basis of quantum physics. And they cannot be rejected, since they are consequences of the so-called quantum postulate: the existence of an indivisible quantum of action given by Planck’s constant h. The quantum postulate is the ontological basis of quantum theory. I formulated its epistemic counterpart in the form of the principle of quantum action invariance.
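For concreteness, the inequalities at issue are usually taken in the CHSH form $S = E(a,b) - E(a,b') + E(a',b) + E(a',b')$ with $|S| \le 2$ for any local hidden-variable model, whereas quantum mechanics allows $|S|$ up to $2\sqrt{2}$ (the Tsirelson bound); this standard statement is added here for context and is not drawn from the chapter itself.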
We analyze the interrelation of quantum and classical entanglement. The latter notion is widely used in classical optical simulations of the quantum-like features of light. We criticize the common interpretation that quantum nonlocality is the basic factor differentiating these two sorts of entanglement. Instead, we point to the Grangier experiment on photon existence, the experiment on coincidence detection. Classical entanglement sources produce light beams with a coefficient of second-order coherence $g^{(2)}(0) \geq 1$. This feature of classical entanglement is obscured by using the intensities of signals in different channels instead of counting the clicks of photodetectors. The interplay between intensity measurement and click counting is not just a technicality. We emphasize the foundational dimension of this issue and its coupling with Bohr’s statement on the individuality of quantum phenomena.
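For reference, the coefficient in question is the normalized zero-delay second-order correlation, $g^{(2)}(0) = \langle \hat a^{\dagger}\hat a^{\dagger}\hat a\hat a\rangle / \langle \hat a^{\dagger}\hat a\rangle^{2}$; any classical field obeys $g^{(2)}(0) \ge 1$, while a single-photon source gives $g^{(2)}(0) \to 0$, which is what coincidence (click) counting in a Grangier-type experiment probes.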
In this chapter we start with a methodological analysis of the notion of a scientific theory and its interrelation with reality. This analysis is based on the works of Helmholtz, Hertz, Boltzmann, and Schrödinger (and the reviews of D’Agostino). Following Helmholtz, Hertz established the “Bild concept” for scientific theories. Here “Bild” (“picture”) carries the meaning of a (mathematical) model. The main aim of the natural sciences is the construction of causal theoretical models (CTMs) of natural phenomena. Hertz claimed that a CTM cannot be designed solely on the basis of observational data; it typically contains hidden quantities. Experimental data can be described by an observational model (OM), often at the price of acausality. The CTM–OM interrelation can be tricky. Schrödinger used the Bild concept to create a CTM for quantum mechanics (QM), with QM treated as an OM. We follow him and suggest a special CTM for QM, the so-called prequantum classical statistical field theory (PCSFT). QM can be considered a PCSFT image, though not as straightforwardly as in Bell’s model with hidden variables. The common interpretation of the violation of the Bell inequality is criticized from the perspective of this two-level structuring of scientific theories.
The contextual measurement model (CMM) introduced in Chapter 10 represents a wide range of non-Bayesian procedures for probability update based on context updates (or state updates). In this chapter we compare Bayesian classical probability inference with general contextual probability inference. CMM is the basis of the Växjö interpretation of quantum mechanics, one of the contextual probabilistic interpretations. This interpretation highlights that the quantum update of probability (based on the state update) is one of the non-Bayesian updates. Quantum mechanics is interpreted as a probability-update machinery.
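As a minimal illustration of the contrast drawn here, the classical Bayesian update is $p(H\mid E) = p(E\mid H)\,p(H)/p(E)$, whereas the quantum (Lüders) update replaces the state $\rho$ by $P_a \rho P_a / \mathrm{tr}(P_a\rho)$ after outcome $a$ with projector $P_a$, and subsequent probabilities are computed from the updated state via the Born rule; the notation is standard and not taken from the chapter itself.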