
January 12, 2018

Aristotle has committed a giant step backward

NEW CREATION OF THE SELF online PG Diploma program by Ameeta Mehra - Enrol now for batch starting 15th Feb - https://t.co/XP59CCv19v
Program Objectives
  • To develop Tools for a New Vision of the Future for the Individual and Society
  • To use the faculty of imagination as a formative power
  • To develop inner psychological attitudes
  • To inwardise, intensify and expand the consciousness
  • To explore the role of the individual to bring about a shift in Society
Program Brief
The program on the ‘New Creation of the Self’ explores the future possibilities of individual and collective growth towards a new vision. This change to a new consciousness is effected through the use of the powerful faculty of imagination and inner psychological attitudes such as aspiration, will, faith, trust and prayer. This is a path-breaking program towards imbibing knowledge through the faculty of imagination, which, unlike the faculty of memory, has not been explored in academia. The students also learn ways to deepen, widen and intensify the consciousness. The aim is to create a new inner psychological foundation for viewing and engaging with the problems and issues of individual and societal progress.
...
Sri Aurobindo on the Rishi as Poet. https://t.co/VZziIWafwB
...

Dear Kashyap, Rajendra and others,

Kashyap, you said in a previous email: [The only remaining thing, which admittedly is not part of the theory of evolution, is to understand how inorganic inert molecules gave rise to life.]

I was wondering if any of you have heard of the 2013 paper in Icarus by shCherbak and Makukov on the genetic code, "The 'Wow! signal' of the terrestrial genetic code". They argue very convincingly that the genetic code has been artificially designed the way it is in order to encode a message! They claim to have found a pattern of exactly balancing nucleon numbers (in the amino acids coded for) across each of the code's well-known symmetries that always constitutes a multiple of 37. And in one of them, when you divide the three totals by 37 you get 9, 16 and 25 (the squares of the sides of the 3-4-5 right-angled triangle - a shape that would be familiar to mathematicians all across the universe). They make a very strong case for this because they have examined all the other potential variations of the code that nature could have used and no such patterns were found. Their conclusion is that the genetic code has been artificially created (due to its highly durable nature) to encode a message from an advanced alien intelligence! I did look to see if the date of publication was the first of April. But in actual fact the work seems to be getting a lot of positive mainstream publicity. I read about it in a mathematics book published by New Scientist magazine, which is a reasonably good secular science publication in the UK.

Although the authors suggest the engineers were aliens, the 37 times table has wonderful symmetries of its own, but only in the decimal number system. Are we really to believe those aliens guessed that intelligent life on this planet would evolve to use such a system? Hence I am more inclined to believe that it was a theistic God, like the universe-wide consciousness that my theory of consciousness appears to predict. Such a being could know the future because he could have subsequently directed evolution to produce ten-fingered brainy creatures.
If that is the case then your worry about how life got started goes away. The important thing to my mind, though, is that the existence of such a being does not in the least make natural selection wrong or even redundant. Life forms have to be adapted to their various niches in order to survive, so such a creator would need to work gradually over many, many generations, each time allowing the sieve of natural selection to determine what changes worked best. The only difference would be that not all the mutations would necessarily be random. Such a being could try out mutations that had worked before or elsewhere. Due to this memory effect you'd get a sort of exponential increase in complexity over time, which I think is what we do see. Yet it does not in the least allow us to excuse ourselves from the need to identify the selection pressures that led to the design-like attributes of our consciousness. Each small step towards these features must have benefitted our ancestors or it would not have been retained.
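As a side note on the base-ten symmetries of 37 mentioned above, here is a minimal Python sketch (purely illustrative, not taken from the shCherbak-Makukov paper) of the repdigit pattern that only shows up when the multiples are written in decimal:

# Every multiple of 3 * 37 = 111 up to 999 is a repdigit in base ten:
# 111, 222, ..., 999.  The same numbers show no such pattern in other bases,
# which is the point that the symmetry is tied to the decimal system.
for k in range(1, 10):
    print(f"37 x {3 * k:2d} = {37 * 3 * k}")

# For comparison, the same multiples written in base 8 (octal):
print([oct(37 * 3 * k) for k in range(1, 10)])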

I'd be interested to hear from anyone else who has come across that paper.  I couldn't see anything wrong with the line of argument. But then I knew very little about the genetic code before reading that paper so am no expert. I have yet to find any article debunking their claims.

All the best, 
Colin

C.  S.  Morrison - Author of THE BLIND MINDMAKER: Explaining Consciousness without Magic or Misrepresentation.




Dear Jo,

You may be right. I don't presume to be able to tell in this case. And I do wish the authors would be explicit about how many alternative symmetries they examined that didn't give exact balances (after all, the probability of hitting a multiple of 37 is just 1 out of 37 - but remember the claims to improbability concern the exact balances, not the nature of the balanced numbers). I made a rough estimate based on the standard deviation of the varying side-chain masses for just one of their exact balances (I worked out how many different pairs of totals similar divisions of the code would be likely to produce) and got a probability of roughly 0.1 percent. They also say that Rumer's bisection - the symmetry associated with their exact balances - is itself highly improbable (which makes it interesting that the perfect balances are found there). They also claim to have tried many other possibilities and found no such balances (though I wish they would demonstrate this in the paper).
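To make that kind of estimate concrete, here is a minimal Monte Carlo sketch of the question "how often does a random bisection give an exact balance?". The integer masses below are hypothetical placeholders, not the actual nucleon numbers from the paper, so only the method, not the output, should be read into:

import random

# Estimate how often a random bisection of a set of integer "masses" gives
# exactly equal totals.  The values below are hypothetical placeholders,
# NOT the nucleon numbers analysed by shCherbak and Makukov.
masses = [75, 89, 117, 121, 131, 99, 105, 113, 133, 147,
          146, 128, 103, 115, 137, 155, 163, 181, 101, 149]

def exact_balance_probability(values, trials=200_000):
    values = list(values)
    half = len(values) // 2
    hits = 0
    for _ in range(trials):
        random.shuffle(values)
        if sum(values[:half]) == sum(values[half:]):
            hits += 1
    return hits / trials

# Prints a small probability (below one percent for these placeholder values).
print(exact_balance_probability(masses))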

As far as being interested in such things is concerned, for me (as no doubt for Leibniz) coincidences are what science is about. I would not criticise Leibniz for wondering if there is a scientific reason for the fortunate position of the moon at a time when life-forms capable of studying the solar corona and testing General Relativity have evolved on earth. At the other extreme one could argue (reminiscent of Hume) that the patterns we call laws of nature are themselves just sets of lucky coincidences (the infinite multiverse cosmologies come to mind). Consciousness is used in humans to encode mainly sensory information, so its design-like perfection with regard to that task is interesting (likewise with avocet beaks).

I believe in explaining an observed phenomenon in the way that is as similar as possible to how the most similar successfully-explained thing is explained. If that happens to be as a random coincidence then so be it. The question for me is merely what that most similar explained phenomenon is.

If the authors can use their patterns to make reliable predictions about other aspects of the genetic code, then the whole thing will get a lot more serious. This may of course never happen. However, the possible benefits of such a discovery are so great that it is definitely worth the risk of looking for them and following up such claims. That is of course why billions of dollars are spent each year hunting for earthlike exoplanets. Earthlikeness may have nothing to do with life, but it is the best guess we have of where to find it. Likewise, patterns in the genetic code may have nothing to do with nonhuman intelligence, but they are in my opinion a good place to look for it.

Best wishes, 
Colin
...

The dominant scientific paradigm is still unaware that metaphysics and theology can be approached, at least according to *some* hypotheses, with the scientific method/attitude (which means modesty, no claim of truth, no ontological commitment, and clear, refutable theories - it is demanding work).

It took many trucks, hoists and cranes to install IBM's first 5-megabyte hard drive, and now you can put millions of those megabytes in your pocket. All applications run on universal programmable chips, so yes, computers are physical implementations of the universal machine, provably so if you accept the theses (proved equivalent) of Church, Kleene, Post, Turing, etc.

That is what the universal machine/number/word likes to do most: to transform itself, with respect to other probable universal numbers. The nesting can be related to dreams inside dreams, but the dreams obey the laws of numbers, and the limits of numbers. From inside, the nesting is truly infinite.

But in contrast to a math duplicator, the cwb can't just materialize an imaginary oxygen molecule out of the idealized benevolent number reservoir; in the actual internal analog structural coding we are running, we have to scavenge for actual extra molecules so we can carry out the ~reaction(s) to completion, so that we get the water molecules involved in the internal structural coding.

Only because you take for granted, perhaps, the idea that the fundamental reality is physical. I do not.

The physical is fundamental, but it is only the Clothes of God. The physical is, or should only be, a tool used by God to say “hello” to Itself.

It is not the fundamentally “real” thing, which is admittedly slightly more “trivial”: the arithmetical reality, and, at some point, even only the semi-computable arithmetical reality.

But the key point to understand is that such a correspondence should itself never be taken for granted, and digital mechanism is a type of religion: it asks for an act of faith. At some point any theology takes the risk of blasphemy, directly or when misunderstood. Here, the mathematics of self-reference can be helpful.

Mechanism is the idea that our bodies are (natural) machines, which means they work through finite local interactions at some description level.

Then, accepting the Church-Turing thesis, it becomes a theorem that all computations are implemented by all computers, including in elementary arithmetic. 

Machines are finite objects, but they can’t avoid growing up and developing themselves through many histories. Then you have to take into account that the machine’s first-person perspective is infinitely distributed over infinitely many computations, and “observably” so below their substitution level (which suggests that the substitution level is the quantum level, which is mainly a notion of isolation rather than of scale).

But for me, nothing is conceivably wilder than the arithmetical reality, up to the point of trying to explain how the apparent orders can emerge from the many-dreams realised in arithmetic.

You might confuse the mathematical theories and the mathematical reality. Since Gödel, we have good reason (even a theorem, assuming mechanism) to believe that the arithmetical reality is quite transcendent. I look for an explanation of the origin of the physical laws which does not put consciousness under the rug, as the average Aristotelian materialist does.
Usually Mechanism is advocated by materialists, but I try to explain why this does not work. Then it is easier to explain how numbers can dream, and how some dreams can develop into stable, sharable physical realities, but with no need of ontological stuff.

You just miss Gödel’s astounding achievement of the arithmetization of metamathematics. You are not alone; despite tons of books, this is largely ignored, but it changes a lot the possible metaphysics/theology available when assuming Mechanism. The 19th century had a reductionist conception of the machine. Since Gödel and Turing, we know that the universal machine is terribly unknown. I did not see why you said the sentence above, and I was suggesting you might be using a reductionist conception of machines.

If you limit my supply of oxygen I will die relative to you. From my first-person perspective I will survive in the computations where you don’t do that, or in any other close consistent history (in some sense related to the logic of self-reference).

It is the frightening aspect of mechanism: we can’t die. But theoretical computer science (intensional number theory) suggests the existence of jumps, if not, as I said above, of much more sophisticated bardos. To leave the cycle of death and rebirth is not easy.

I mean that I keep my mind open to the idea that our physical universe might have life forms based on different constituents and sets of laws, and doubly so with mechanism, where we are light-years away from getting something like our three-dimensional or 24-dimensional physical theories. But physics, as an ontological science, is refuted, from the mechanist perspective. But physics was not invented for doing metaphysics, so that is rather normal.

I insist that my hypothesis is that “I” am a mechanism. (That is, I survive with a machine as a body.) Then I can show that this entails that both consciousness and matter are not mechanical. This comes from the first-person indeterminacy: the fact that if we are digital machines we are duplicable (at some right level of description), and we are indeterminate over all the computations going through our current local relative state. You need to study the seven steps of the “Universal Dovetailer Argument”, as in the sane04 paper. We are randomly selected on all equivalent computations (in arithmetic).
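For readers unfamiliar with the term, here is a minimal sketch of what a dovetailer does, with Python generators standing in for the programs of a universal machine (a simplifying assumption, not Bruno's actual construction): it interleaves the execution of every program in an enumeration, so each one eventually receives arbitrarily many steps even though some never halt.

from itertools import count

def program(i):
    """Toy stand-in for the i-th program: an endless computation."""
    x = i
    for step in count():
        x = (x * x + 1) % 1_000_003   # arbitrary busy-work
        yield (i, step, x)

def dovetail(rounds=5):
    running = []
    for n in range(rounds):
        running.append(program(n))   # start program n in round n
        for proc in running:         # then give every started program
            state = next(proc)       # one more step of execution
            print("program %d, step %d, state %d" % state)

dovetail()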

When I was a young student, the term “consciousness” was a taboo word. I think that in the Occident, science started with Pythagoras (-500) (including the serious theology … and also already its misuses and abuses) and stopped with Damascius (+500). We are still in the Middle Ages. The Enlightenment recovered the scientific attitude in all domains except the fundamental ones: theology and the human sciences. Theology needs to come back to the academy of sciences, where we practice modesty through refutable theories.

I would say that the flaw comes from making the 1p into an epiphenomenon, when not putting it straight away under the rug. I think that Aristotle has committed a giant step backward. He did not understand Plato, and took for granted the ontology of a physical universe. That has led us to various forms of materialism, which might be flawed or not, but we have to test this, and the incompatibility between Mechanism and Materialism leads us to such tests (and at first sight it favours mechanism over materialism).

Kind regards,

Bruno
...

It is the idea behind Gödel’s theorem, but once made mathematically precise it only shows that we cannot use reason to guarantee truth. We can only hope to be consistent and sound. It means that proof/belief should not be systematically associated with consistency and soundness. It is part of what I called machine’s metaphysics or theology. It is very important in machine theology, where invoking truth is akin to blasphemy. We have to stay modest. Intelligence is only the courage to try theories, and to admit we *might be* wrong.

Bruno


Dear Kashyap/Roman:

Consciousness is fundamentally a universal, not a biological, phenomenon. Any local model of biologically brain-induced consciousness is bound to be incomplete and bound to fail universally, in the same way that the standard cosmology model fails to predict 96% (dark energy/dark matter) of the universe.

A fundamental model of consciousness must be built as a wholesome universe model wherein the dark energy/dark matter problems are fully resolved without any inconsistencies and paradoxes.

The above is based on my personal experience with the Universal Relativity Model, integrating the missing fundamental physics of spontaneous decay into a cosmology model depicting a wholesome continuum of matter/mind/consciousness. Such a model is vindicated by empirical observations of the universe, and it also provides testable and falsifiable predictions for future vindication.

A local brain-induced consciousness model is like a model of a well trying to depict the ocean, or a model of a cloud trying to depict the vast space of the universe. It is also like showing a candle to the sun.

Best Regards
Avtar

Dear Ram:
I agree with your statement: “…QM-nonlocality and FTL-IT are controversial and unclear."

The above inconsistencies and controversies are artefacts of the missing physics in current quantum theories that Einstein also pointed out. The following is a possible Special Relativity based explanation to resolve the dilemma:

Physical information constitutes non-zero mass-energy. Hence, a photon of light carrying and transferring information has a non-zero mass and cannot travel at the speed of light. Its V is very slightly less than C, but so close to C that the difference from C is practically non-measurable or discernible via normal measurements. Hence, all information is transferred at V < C.

There exists an uncountable number of zero-mass photons, or zero-point energy, in the V = C state of fully dilated space-time (eternal). However, this energy is unmeasurable as it exists in a zero space-time state. We, human observers, and our senses or instruments can only perceive/measure non-zero-mass photons carrying information from stars and galaxies, completely missing the zero-point energy or photons.

Hence, your statement - “…. Since there is a transfer of physical information at v<=c, it is local at v=c as well. Physical information is transferred from sun to earth by light at v=c, but it takes time.” - is in violation of special relativity, as no information (non-zero mass) can be physically transferred at V = C. We, human beings, assume that sunlight photons have zero mass and V = C for practical purposes, as we are unable to measure a velocity very, very slightly less than but terribly close to the speed of light. The V = C assumption for sunlight photons is acceptable only as a FAPP (for all practical purposes) principle, but is theoretically (SR) incorrect. V > C (FTL-IT) is physically impossible since there is no space-time at or beyond V = C. Information can only exist and be transferred at V < C.
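For what it is worth, here is a minimal sketch (standard special relativity only) of how close to C a particle of total energy E and tiny rest mass m would travel. The rest-energy value used is an arbitrary illustration of Avtar's hypothesis that an information-carrying photon has non-zero mass; it is not a measured or claimed figure.

import math

C = 299_792_458.0   # speed of light, m/s

def one_minus_beta(rest_energy_eV, total_energy_eV):
    """Return 1 - v/c for a particle of given rest energy and total energy,
    using the standard relation v/c = sqrt(1 - (m c^2 / E)^2)."""
    x = rest_energy_eV / total_energy_eV
    if x < 1e-8:                 # avoid floating-point cancellation
        return 0.5 * x * x       # series expansion of 1 - sqrt(1 - x^2)
    return 1.0 - math.sqrt(1.0 - x * x)

# A 2 eV (visible-light) quantum with a purely hypothetical 1e-18 eV rest energy:
deficit = one_minus_beta(1e-18, 2.0)
print("1 - v/c      =", deficit)          # ~1.3e-37
print("c - v (m/s)  =", C * deficit)      # ~3.7e-29 m/s below c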

A philosophical interpretation of V = C could be Cosmic Consciousness, or eternal existence in the absolute zero-point state that is purely kinetic, everywhere and all the time, but unmeasurable in the implicate state (this is not far from the common definition of Brahma or God).

Best Regards
Avtar

Established beliefs in science are as difficult to overcome as those in religion. But one can always try. I am working on the paper below and will send it to the group soon, when it is ready:

“What is Fundamental” – Missing Physics of Anti-gravity
Avtar Singh, Sc. D.
Massachusetts Institute of Technology Alumni
Center for Horizons Research, avsingh@alum.mit.edu

ABSTRACT
A fundamental concept or law represents the underlying foundation on which the next level or a comprehensive physical theory is built, and without which a coherent and consistent description of empirical observations at all scales is impossible. The widely accepted current mainstream theories - General Relativity (GR), Quantum Field Theory (QFT), Maxwell’s Theory, and the Standard Big Bang Model (BBM) - although vindicated by multiple worldly experiments, are known to exhibit inconsistencies and paradoxical results at the universal scale, pointing to possible missing fundamental physics. “What is fundamental” is exemplified in this paper via identifying a potential missing fundamental phenomenon of anti-gravity, or spontaneous mass-energy conversion leading to spontaneous expansion, as evidenced in the observed accelerated expansion of the universe. Relativistic formulations of this fundamental phenomenon provide a new photon dynamics model that eliminates inconsistencies in the current photon model of Maxwell’s theory. Integrating gravity into this model further provides a fundamental universe model that is shown to predict the observed universe behavior and resolves the current paradoxes (black hole singularity, dark energy, dark matter, inflation). It also explains the apparent weirdness of the inner workings of quantum mechanics (quantum gravity, parallel universes, the observer’s paradox, and nonlocality), eliminating known inconsistencies of current theories. The model also provides testable predictions for falsification via future observations. The proposed model provides a new fundamental universal understanding of key concepts of physics, cosmology, and universal reality.

Regards
Avtar
 Jan 10, 2018

Dear Vinodji,

Experimentally you can run a beam of incident particles like electrons, photons or whatever onto a target consisting of the particles you want to do experiments on, such as atoms, molecules, etc. Technically this is called scattering. Sometimes two beams are made to run into each other. The outgoing particles will be entangled to a certain extent. But more commonly, photons are passed through crystals, or even an ordinary plate called a beam splitter, to produce two beams which are entangled. Entanglement is produced by the usual quantum mechanical forces, expressed as a Hamiltonian or Lagrangian. These are spin-dependent forces. The particles should be within the range of interaction, not outside it. This is different for each case.
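To make the end product concrete, here is a minimal NumPy sketch (a textbook illustration, not a model of any particular crystal or beam-splitter setup) of the Bell state such sources ideally aim to produce, together with a check that it carries one full bit of entanglement:

import numpy as np

# The Bell state (|00> + |11>)/sqrt(2): the ideal output of an entangling source.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

# Full two-particle density matrix, then trace out particle B.
rho = np.outer(bell, bell.conj())
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# A maximally mixed reduced state (identity/2) means maximal entanglement.
eigvals = np.linalg.eigvalsh(rho_A)
entropy = -sum(p * np.log2(p) for p in eigvals if p > 1e-12)

print("reduced density matrix of particle A:\n", rho_A)   # 0.5 * identity
print("entanglement entropy (bits):", entropy)            # 1.0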

The interesting thing is that what started as a mostly philosophical inquiry by Einstein about reality has become a big field of experimental physics! Entanglement is more common than we used to think. It may very well be  that particles run into each other naturally and get entangled. It is very likely that nature uses entanglement at many places. Stay tuned!

Best Regards.

kashyap

Hi Roman,

Interesting question. I will look into it. It seems to me that not much work has been done on entanglement in the brain, or in fluid systems in general. But if, by scattering photons and electrons (most likely bound in atoms) off each other, one can produce entangled pairs, these particles surely get entangled when they run into each other because of fluid motion in the body or electrodynamic movements. At any rate this may be worth looking into. I understand Penrose and Hameroff use entanglement and subsequent collapse of the wave function to suggest conscious experience. Perhaps in your model of hydronic ions a similar thing can happen.

It is very interesting to note that what started out as a mostly philosophical inquiry on reality by Einstein has become a big experimental field in physics! Now we know that entanglement is more common than we used to think. It is quite likely that nature is using this phenomenon for some purpose.

Best Regards.

Kashyap

Dear Vinod/Kashyap:
The observed photon correlation and entanglement are explained by SR via space-time dilation at V close to C. The two apparently miles-apart photons experience space dilated to zero and hence remain correlated and appear entangled, acting as ONE and the same photon.

Please also see my previous post - Speed V is ALWAYS < C; even a photon of light moves at V < C.

Thanks
Avtar
...

Will to Live or No-Will to Live? The Points of Convergence of the Thoughts of Schopenhauer, Wittgenstein, and Aurobindo on Living a Meaningful Life
KC Pandey - Schopenhauer on Self, World and Morality, 2017
… VI. Sri Aurobindo. As an Advaitin, Aurobindo believes in the Universal Will as well as individual will … For Sri Aurobindo, the real problem of living a meaningless life does not lie outside in the world. The problem lies within our own being at the individual level …

The Quintessence of the Upanishadic Wisdom and the Solace of Schopenhauer's Life
KM Pathak - Schopenhauer on Self, World and Morality, 2017
… My Indian reminiscences (A. King, Trans.). New Delhi: Asian Educational Services [First published London, 1893]. Isha Upanishad (Sri Aurobindo Ashram) … Cambridge: Cambridge University Press. Kena Upanishad (Sri Aurobindo Ashram) …

Introducing Schopenhauer's Philosophy of the World, Self and Morality in the Light of Vedantic and Non-Vedantic Wisdom
A Barua - Schopenhauer on Self, World and Morality, 2017
… Pandey argues that Schopenhauer's concept of the Will as 'thing-in-itself' and Will as 'the phenomenal appearance' can be seen from the perspective of Wittgenstein's 'Showable and Sayable' distinction and Sri Aurobindo's distinction between 'Universal Will' and 'individual will' …

Schopenhauer on Self, World and Morality
A Barua
… the way Schopenhauer's thought stands to sources that inspired and moved the development of his thought, the relations of Schopenhauer to important figures such as Abhinavagupta, Gaudapada, Bhartṛhari, Sankara, Rabindranath Tagore, Sri Aurobindo, KE Neumann …
