Why no "Twitter" or "Facebook" profile for WrennLaw.Com? I don't use such sites and don't want to waste time using them, updating them, monitoring them, etc. It's not that I don't understand the technology; it's that I do understand it (as made clear in my Continuing Legal Education seminars on the internet, digital-age ethics, electronic discovery, privacy, waivers, etc.)
James R. Wrenn, Jr., Attorney At Law
Science, Technology, Human Intelligence, Artificial Intelligence & Liberty.
Since the advent of civilization, human intelligence has reigned at the pinnacle of intelligence on the planet. Many (if not most) religious people believe such intelligence arose "in the image of God" -- i.e., as intelligent beings with free-will. Non-theistic[01] people believe such intelligence arose by evolution (a process from which the role of a "Creator" cannot be scientifically excluded), likewise yielding intelligent beings with free-will. Indeed, to recognize those two assertions is to recognize what ought to be understood as foundational principles for a Grand Unified Theory of Science, Philosophy And Religion (GUTSPAR).
But in this 21st century, artificial intelligence being invented by human intelligence is rapidly approaching a point at which it may have the power and effect not only of eclipsing human intelligence but also of severely impairing, if not eliminating, human free-will. This may seem an incredibly radical and alarmist view, but I am not, and don't claim to be, the first to recognize the scope and magnitude of the threat technology poses to human free-will in the near, not distant, future. There's a single word that best describes "free-will": "liberty." (Some physicists may claim liberty -- i.e., free-will -- is a quantum illusion[02], but that's another argument for another day.)
There can be no doubt that scientists on what will become the front lines of the battle between technology and human free-will must play a vital role if human free-will -- i.e., liberty -- is to be preserved, but another profession also has a vital role to play in that battle: the legal profession. How and why? Historically enlightened analysis of the United States Constitution reveals how its provisions impose uniquely upon the legal profession the broad duty to protect liberty writ large. For elaboration, see Footnote 03.
Recent Technological Milestones Pertaining to Science, Technology & Liberty:
1930s: Calculators -- used by the military for artillery calculations
1940s: Alan Turing (Father of the Modern Computer) (vacuum tubes). Watch the movie The Imitation Game if you haven't already done so. British computer at Bletchley Park. (Concept: Turing Machine) (Video Explanation of "Turing Machine")
1945: ENIAC computer (USA) (gigantic; 18,000 Vacuum Tubes) (See video) (See more at Smithsonian)
1950s: Transistors replace vacuum tubes.
1958: Integrated circuits (silicon chips) begin to replace transistors (September 19, 1958)
1960s: Early 1960s -- NASA -- humans still doing the computing. Watch the movie Hidden Figures if you haven't already done so.
1965: Moore's Law: Progress with the silicon chip had begun increasing exponentially. The prediction by Intel co-founder Gordon Moore that processing power would double (and size fall by half) every two years became known as "Moore's Law" and proved true for decades to come.
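The doubling Moore predicted can be sketched in a few lines of Python. (A sketch only: the 1965 baseline transistor count of 64 is an illustrative assumption, not a figure from this page.)

```python
# Illustrate Moore's Law: transistor counts doubling every two years.
# The 1965 baseline of 64 transistors per chip is an illustrative assumption.

def moores_law_count(year, base_year=1965, base_count=64, doubling_years=2):
    """Projected transistor count per chip under a strict two-year doubling."""
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

for year in (1965, 1975, 1985, 1995, 2005):
    print(year, round(moores_law_count(year)))
```

Even from a tiny baseline, forty years of doubling yields tens of millions of transistors, which is why the looming end of the doubling (discussed under "1990s" below) so alarmed computer scientists.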
1969: Moon Landing. Watch the documentary-movie "Apollo 11" if you haven't already done so. Pay attention to the "computer" warnings (from the computer guiding the lunar lander to a landing) that NASA deemed safe to ignore, after which Neil Armstrong (a test pilot with nerves of steel) calmly took control of the landing away from the computer, with little more than a minute remaining, in order to land safely rather than crash. (Wrenn -- useful comparison: Now look at your watch. If it's a digital watch, consider that the computational power of the tiny chip-set inside your watch is much greater than that of the dramatically larger computer on the lunar lander from which Armstrong took control.)
1980s: Birth of the Internet -- An advanced network (ARPANET) previously designed by the U.S. military becomes the "internet": In the early 1980s, use of the ARPANET began to expand beyond the Department of Defense to include commercial activity serving the needs of advanced research. Such expansion intensified the need for interoperability ("Interop") protocols and prompted collaborative activities to satisfy that need. As interoperability between the ARPANET and other networks increased, it became a network between and among networks -- i.e., an "internet." "By the middle of the 80's there were ARPANET gateways to external networks across North America, Europe, and in Australia, and the Internet was global in scope." (Marty Lyons created a hand-drawn map of the existing network gateways as of June 18, 1985.) In 1985, the NSF considered how it could provide greater access to the high-end computing resources at its recently established supercomputer centers. Because the NSF intended the supercomputers to be shared by scientists and engineers around the country, any viable solution had to link many research universities to the centers. The NSF therefore created the NSFNET to interactively link the supercomputing centers of CSNET (which the NSF had created in 1981), with interoperability among them founded on the concept envisioned by Paul Baran. NSFNET went online in 1986 and connected the supercomputer centers at 56,000 bits per second -- the speed of a typical dial-up modem. In a short time the network became congested, and by 1988 its links were upgraded to 1.5 megabits per second. A variety of regional research and education networks, supported in part by the NSF, were connected to the NSFNET backbone, extending the Internet's reach throughout the United States.
Creation of NSFNET was an intellectual leap: it was the first large-scale implementation of Internet technologies in a complex environment of many independently operated networks. NSFNET forced the Internet community to iron out technical issues arising from the rapidly increasing number of computers and to address many practical details of operations, management and conformance. (Source for the above.) The mid-1980s growth in the need for interoperability prompted collaborative activities leading to the first Interop "trade show" in 1988. By this time the network extended throughout the United States (and, for research and some governmental purposes, to research centers outside the U.S.). This stimulated expansion of the Internet to include private and commercial usage in addition to research, governmental and government-contracting uses.
1989: Invention of the "World Wide Web" to enhance functionality of the "internet" invented in the U.S.A.: In 1989, Tim Berners-Lee and Robert Cailliau (scientists at CERN in Switzerland) began designing a proposal for ways to enhance the functionality of that network. Building on the concept of hypertext (whose interactive use had been pioneered alongside Doug Engelbart's "mouse"), they completed the project by designing "HyperText Markup Language" (HTML) as code enhancing the functionality of the Internet by improving the ways computers could assemble, organize and display data and enable human interaction with it. When they completed this project in 1990, they named it the "World Wide Web" (or "WWW" or "www"). (More at http://www.w3.org/Consortium/history.html.)
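The core idea of HTML -- hyperlinks embedded in ordinary text -- can be illustrated with Python's standard-library HTML parser. (The snippet of markup below is an illustrative example, not from this page; info.cern.ch is the address of the first website.)

```python
from html.parser import HTMLParser

# Minimal illustration of hypertext: extract the link targets
# (href attributes of <a> tags) from a fragment of HTML.
class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

parser = LinkExtractor()
parser.feed('<p>See the <a href="http://info.cern.ch">first website</a>.</p>')
print(parser.links)   # ['http://info.cern.ch']
```

Every page on the Web is, at bottom, text marked up this way, with links that any machine can follow -- which is what let the Web grow on top of the existing Internet.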
1990s: Fears rising about the inevitable end of Moore's Law: In the 1990s, computer scientists worldwide were concerned that computer technology was fast approaching the end of "Moore's Law" -- i.e., that it would no longer be feasible to keep halving the size of hardware memory (silicon chips) and thus no longer possible to keep doubling computational power every two years. A worldwide race began among computer scientists and physicists to replace hardware memory (silicon chips) with a molecular form of memory and thereby overcome the end of Moore's Law.
2002: Exposé of a scientific scandal (described below): virtually unanimous scientific consensus that a claimed "molecular-chip" solution to the soon-approaching end of "Moore's Law" (exponential increases in silicon-chip computational power) was Nobel-Prize-worthy -- See the documentary video: Dark Secret of Hendrik Schön.
What was the scandal?
The "holy grail" sought by computer engineers worldwide was a way to design molecular memory to replace microchip memory. Doing so would enable a leap from today's micro-chip technology to molecular "chip" technology even greater than the leap from vacuum tubes to micro-chips.
Jan Hendrik Schön, a brilliant physicist already widely respected in his field, claimed to have invented/designed a workable system for molecular memory. Nature, the world's most prestigious scientific periodical for peer-review and publication of scientific discoveries and knowledge, published (and, in doing so, lauded) three articles by Schön explaining his design of a working system for molecular memory. For years, world-wide scientific "consensus" accepted his work as valid. Scientific consensus was that it was not whether, but merely when, Schön would receive the Nobel Prize.
Then a graduate student working late into the early-morning hours on a research assignment discovered something strange about Schön's three articles: Three graphs (one in each article) purporting to show exquisitely detailed graphical plots of raw data (not regression analyses) from three experiments purported to have been conducted years apart were – down to the most minor detail—identical, which would be a scientific impossibility.
Although academia-pecking-order types of roadblocks impeded her efforts to expose the fraud at first, it wasn't too long before Nature magazine was trying to smile while wiping the three (count 'em, three) eggs off its face. Ultimately, of course, the young graduate student's work served to expose the fraud the scientific community of experts had so eagerly and readily embraced by "consensus" as true (without conducting any scientifically rigorous investigation).
Why did so many eminent scientists accept his "work" so uncritically? It seems their emotional investment in the desire for it to be true was so strong that it overpowered their training in scientific skepticism-- Sound familiar?
Scientific Scandal: Scientific Consensus Completely Fooled by Bogus Scientific Claim of Achieving Molecular Memory to Overcome Moore's Law--
Watch Dark Secret of Hendrik Schön: View the entire video: [here] [here] [here] or [here].
View video starting at: [explanation of need to overcome Moore's Law]
Scientific investigation: [Investigative Report by Bell Labs in 2002]
BBC information about the documentary: [Archived Information here]
Alternate links to the video-documentary Dark Secret of Hendrik Schön: entire video: [here] [here] [here] or [here]; or beginning with the explanation of "Moore's Law": [here].
Post-Scandal -- In the wake of the scandal, scientists worldwide intensified their efforts to develop/create workable molecular-level replacements for microscopic-level "chips."
2003: Quantum-Computing theories advancing for ways to manipulate data at a sub-atomic level.
2004: Google's Co-Founder, Larry Page, describes Google's long-term goal: "If you think the question, you will be provided the answer." [source] Note: The preceding quotation is a substantively accurate paraphrasing of what Larry Page literally said (in being interviewed by Steven Levy in writing the book, In the Plex, about the rise of Google): "Eventually you'll have an implant, where if you think about a fact, it [Google] will just tell you the answer." See entry infra in 2018 titled "Think-the-Question, Hear the Answer from Google via Skull Vibrations" (April 4, 2018)
2007: Quantum Computers -- How they would work-- Qubits at the sub-atomic level (i.e., even below the molecular level).
2010: Custom Proteins Drawn from Genetically Engineered Trees Expand Silicon Chips' Memory Capacity
2011: Quantum Levitation [link here] (October 16, 2011) (Not related to quantum computing, but interesting nevertheless as an illustration of the strange differences between the "normal" effects of physics and the effects of quantum physics. A different phenomenon, quantum entanglement, is related to quantum computing -- see "Quantum Entanglement" under 2013 below.)
2012: Turning DNA into a hard drive: Stanford's Drew Endy and his lab figured out a way to turn DNA into a rewriteable data storage device that can operate within a cell. (Click the link for the entire article.)
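For perspective on why DNA is attractive as a storage medium: each base can encode at most two bits, since there are four bases (A, C, G, T). A rough, idealized calculation follows. (The genome-size figure is a standard textbook approximation, not from the article, and real DNA storage schemes use far less than this theoretical density.)

```python
import math

# Idealized upper bound on DNA storage density: log2(4) = 2 bits per base.
bases = "ACGT"
bits_per_base = math.log2(len(bases))

# Approximate haploid human genome size, in bases (standard approximation).
human_genome_bases = 3.2e9

genome_bytes = human_genome_bases * bits_per_base / 8
print(f"{genome_bytes / 1e6:.0f} MB")   # 800 MB
```

In other words, the entire human genome fits (at this idealized density) on less than a single CD-ROM's worth of data, packed into every cell -- which is the intuition behind treating DNA as a hard drive.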
2012: Tweaking Moore's Law and Computers of the Post-Silicon Era (Michio Kaku).
2012: Nano-Technology.
2013: Quantum Entanglement (what Einstein called "spooky" action at a distance) Vastly Exceeds Speed of Light. This article [here] or [here] is about Chinese scientists measuring the speed at which a change in the state of one of two "entangled" particles separated by a great distance is reflected in the second; they found it to be no slower than "10,000 times the speed of light." Note: Quantum entanglement is vital to a particular aspect of quantum computing. (March 17, 2013). (How does such action at a distance -- i.e., a local cause yielding an effect far away -- relate to Einstein's theory that nothing travels faster than light? Go here[04] for a hypothesis.)
2012: Internet of "Things" to Become Fruitful Sources for Surveillance (TVs, refrigerators, appliances, etc.), but remember, Google (and other data-mining tech giants) already know vastly more about you than the CIA could ever dream of knowing.
2013: Google Wants Microphones in Your Ceiling and Chips in Your Head. (December 9, 2013).
2014: Israeli Scientists Use DNA to Design Molecular-Wire for Computers.
2014: Google working on super-fast 'quantum' computer chip [here] if not [here]. (Difference between "quantum" chip and "molecular" chip?)
San Francisco (AFP) - Google said it is working on a super-fast "quantum" computer chip as part of a vision to one day have machines think like humans. The Internet titan on Tuesday added renowned researcher John Martinis and his team at the University of California, Santa Barbara, to the Quantum Artificial Intelligence team at Google, according to director of engineering Hartmut Neven. The new hires are part of a "hardware initiative" to design and build chips operating at sub-atomic levels in ways making them exponentially faster than processors currently used in computers. "With an integrated hardware group the Quantum AI team will now be able to implement and test new designs," Neven said of the quest for a transformative new chip. Last year, Google's artificial intelligence lab partnered with US space agency NASA on quantum computing research.
2014: Silicon chip implanted in brain enables previously paralyzed man to move hand with his own thoughts. [here or here] (June 24, 2014).
2015: Prediction: DNA-Based Computers Soon to Replace Microscopic-Level Silicon Chips.
2015: Silicon-chip brain implants enable patients whose brain injuries prevent them from retaining short-term memories to capture those memories in long-term memory -- an ongoing DARPA project. [here or here]. (September 29, 2015).
2017: Even silicon chips (i.e., not "molecular" chips or "quantum" chips) are already enabling brain-machine interfaces to help the disabled [here or here] (February 6, 2017)
2017: Prediction: DNA-Computer (Organic Computer) Coming Soon.
2017: Smart-Phones available with 256 Gigabytes [source] (How does that approximately 2.5" x 5.5" hand-held device compare with the gigantic ENIAC? Suppose we had to build such a smart-phone using ENIAC technology. How long would the ENIAC version be? Divide 256 gigabytes by 2,250 bytes, multiply the result by the length of ENIAC (more than 40 feet), then divide by 5,280 feet per mile. The result, in miles, is approximately the diameter of the Sun.)
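The arithmetic above can be checked directly. (The 2,250-byte and 40-foot figures are the ones given in the text; decimal gigabytes are assumed.)

```python
# Check the ENIAC-vs-smartphone comparison from the entry above.
smartphone_bytes = 256 * 10**9   # 256 GB, using decimal gigabytes
eniac_bytes = 2250               # ENIAC storage figure given in the text
eniac_length_ft = 40             # "more than 40 feet"
feet_per_mile = 5280

eniac_units_needed = smartphone_bytes / eniac_bytes
total_length_miles = eniac_units_needed * eniac_length_ft / feet_per_mile
print(f"{total_length_miles:,.0f} miles")   # roughly 862,000 miles

# For comparison: the Sun's diameter is about 864,000 miles,
# so the comparison in the text checks out.
```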
2017: U.S. v. Carpenter [link] (Majority opinion opines, inter alia, that people consider the smart phone "almost a part of the human body.") (June, 2017) [See also "Google-- Ready to Market CSLI Smart-Phone Data" infra in 2019.]
2017: Hacking the Brain -- Surgical implants of "DNA strands" controlling movement (mice first, humans next) [here or here] [See also here or here] (August 18, 2017)
The effort, led by physics professor Arnd Pralle, PhD, of the University at Buffalo College of Arts and Sciences, focused on a technique called "magneto-thermal stimulation." It's not exactly a simple process — it requires the implantation of specially built DNA strands and nanoparticles which attach to specific neurons — but once the minimally invasive procedure is over, the brain can be remotely controlled via an alternating magnetic field. When those magnetic inputs are applied, the particles heat up, causing the neurons to fire. The study, which was published in the most recent edition of the journal eLife, includes experiments which were performed on mice. Using the new technique, the researchers were able to control the movement of the animals, causing them to freeze, lock up their limbs, turn around, or even run. Despite only being tested on mice, the research could have far-reaching implications in the realm of brain research. The holy grail for dreamers like Elon Musk is that we'll one day be able to tweak our brains to eliminate mood disorders and make us more perfect creatures. This groundbreaking research could very well be an important step towards that future.
2017: Prediction: Chips inserted in brains will give us MIND-BLOWING abilities within years. (November 9, 2017).
2017: More Hacking the Brain: External instructions transmitted electronically into the brain (Rhesus monkeys for now) [here or here] (December 7, 2017)
2018: Think-the-Question, Hear the Answer from Google via Skull Vibrations. [here or here] (April 4, 2018) Although this appears to suggest that the technological means for virtually destroying free will has already arrived, that's not the case: under this particular procedure, technological detection of the question is still external, via a device fitting on the outside of the skull, so it would still be possible for the questioner to choose the question. But at the speed at which organic versions of the technology are advancing, a feasible organic "implant" lies in the near, not distant, future. What is likely to happen, and when? Keep thinking that question as you continue reading to the end (or, if you're too impatient, click this and then come back here).
2018: Brain-to-Brain connections -- Connecting two human brains to a third enables rudimentary thoughts of the two to influence choices by the third. (October 7, 2018)
Excerpt: The team used "electroencephalograms" (EEGs) to record electric impulses from two human brains and "transcranial magnetic stimulation" (TMS) to deliver information to a third brain. The end result: an interface that allowed three human subjects to collaborate and solve Tetris problems using brain-to-brain communication. In the test, two "senders" were connected to EEG sensors and communicated to a third person, the "receiver" via a TMS helmet with the ability to send flashes directly to the brain. Rest of the article here or here.
2018: Smart-Phones soon to become "wearable" or with "invisible interfaces" (November 13, 2018)
Excerpt: "The transition from smartphones to smart wearables and invisible interfaces -- earbuds that have biometric sensors and speakers; rings and bracelets that sense motion; smart glasses that record and display information -- will forever change how we experience the physical world," For the rest, go here or here.
2018: Thought-Controlled Televisions -- Soon to be released for testing by Samsung [here or here] (November 13, 2018)
2019: Google-- Ready to Market CSLI Smart-Phone Data???? [here] [couldn't be saved in WayBack] (January 28, 2019) Also, read the U.S. v. Carpenter case (supra, decided in 2017) .
2019: Human Thoughts Empowered/Implemented by Computer Technology (March 4, 2019): What if you could make money, or type something, just by thinking about it? It sounds like science fiction, but it might be close to reality.-- [Link] or [link]
2019: Brain-implanted chips for medical/rehabilitative purposes (March 4, 2019) -- Excerpt from end of article: "Approximately 40,000 people in the United States already have smart chips [for medical/rehabilitative purposes] in their heads, but those brain implants are only approved for medical use for now." [here or here]
2019: Brain-Implants -- Research toward using Artificial Intelligence to Convert Thoughts to Speech for people with brain-injuries/disabilities. [here or here] (April 24, 2019)
2019: Claim: China May be Close to Making "Brain-Chip-Interface" to connect human brain to computer. Links: [here] or [here] (Possible Russian propaganda) (May 6, 2019)
2019: Brain-hacking/Brain-Augmenting -- Experiments with rats showing promise for using implanted chips to favorably or unfavorably affect mental performance. [here or here] (June 24, 2019)
2019: Solar-powered flying-insect-robot [here or here] (June 27, 2019)
2019: Quantum Computing -- China in Forefront. The quantum revolution is coming, and Chinese scientists are at the forefront. (or here) (August 18, 2019)
2019: Smart-watch recording sexual activity of wearer (Apple apologizes for Siri surveillance). [here or here] (August 29, 2019)
2019: Tech-Giants Seeking to Use Hardware/Software to Read Thoughts -- Neuralink (Elon Musk's company) [here or here] (August 29, 2019).
2019: Quantum Computing -- Google Claims "Quantum Supremacy" [here or here] (September 20, 2019). "Quantum supremacy" means that a "quantum" computer can perform tasks beyond the capabilities of the best non-quantum computer on the planet. (It uses quantum characteristics of sub-atomic particles rather than of entire molecules or atoms.)
Excerpt: "While our processor takes about 200 seconds to sample one instance of the quantum circuit 1 million times, a state-of-the-art supercomputer would require approximately 10,000 years to perform the equivalent task," the researchers said. Google's quantum computer, dubbed "Sycamore," contained 53-qubits, or "quantum bits," a measure of the machine's potential power. The team scaled back from a 72-qubit device, dubbed "Bristlecone," it had previously designed. The researchers estimate that performing the same experiment on a Google Cloud server would take 50 trillion hours—too long to be feasible. On the quantum processor, it took only 30 seconds, they said. "Quantum processors based on superconducting qubits can now perform computations...beyond the reach of the fastest classical supercomputers available today," the researchers write. "To our knowledge, this experiment marks the first computation that can only be performed on a quantum processor." For the entire article, click here or here.
What is "quantum" computing? Start with this video here (but be patient), or watch a shorter video: "Quantum Computing in a Nutshell" (with particular attention to the part starting at 22:54).
2019: Biological Computing -- Next Frontier. Former Google CEO Eric Schmidt believes biology is the next frontier in computing. (October 2, 2019).
2019: Terminally-ill scientist undergoing process to "survive" as first human Cyborg via brain-computer/machine interfaces. [here or here] (October 11, 2019).
2019: Mind-reading (provocative title) -- Thought-Reading -- [here or here] (October 16, 2019): A speculative, lightweight but enlightening article on progress thus far in non-invasive algorithmic interpretation of the thought patterns for physical movements, versus the far greater challenge of invasive mechanisms for interpreting, and translating into machine performance, thought patterns representing actual thinking rather than mere commands for physical movement (i.e., of arms, legs, etc.). But the article is conceptually correct in recognizing the not-distant-future feasibility of invasive interpretation (via nano-bots) of thoughts -- i.e., what really could be described as "mind-reading." [But see "2015: Silicon-chip brain implants" in the timeline above regarding invasive techniques enabling neural signals representing short-term memories to be algorithmically encoded, transmitted and "re-broadcast" (perhaps in an analogue fashion) into the long-term-memory part of the brain, to help patients whose brain damage or deterioration prevents capture of important short-term memories into long-term memory. That is an example of "invasive" technology.]
2019: Research (by Argonne at the Dept. of Energy) (October 18, 2019) on insect brains (bees, ants, etc.) paves the way for biological computers -- organic computers -- neuromorphic computing -- as a basis for further developing AI (Artificial Intelligence). Two related articles on the same subject: [here or here] [here or here]
2019: Re Google's September claims of "Quantum Supremacy" -- read the article -- the headline is misleading -- (October 23, 2019) [here or here]: Google's prior claim (see September 20, 2019) was that Google's 53-qubit quantum computer performed in 200 seconds a task that IBM's best non-quantum silicon-chip computer would need 10,000 years to complete. IBM says Google's claim is founded on errors: IBM's non-quantum computer could perform the particular task in less than 3 days (not 10,000 years). Nevertheless, the quantum computer is still vastly faster (200 seconds versus 3 days), though the particular task has little practical utility because it was designed solely as a task suitable for measuring computational speed. (IBM is also working on quantum computing but has not made claims such as those recently made by Google.) Wrenn -- Given the scientific fraud exposed as the "Dark Secret of Hendrik Schön" described supra in "2002," it's wise to be skeptical of extraordinary claims, which should require extraordinary proof.
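The competing claims in this entry reduce to two simple ratios, computed here from the figures quoted above (200 seconds of quantum runtime in both cases):

```python
# Compare Google's claimed speed-up with IBM's rebuttal, using the
# figures quoted in the text.
quantum_seconds = 200

google_classical_seconds = 10_000 * 365.25 * 24 * 3600   # "10,000 years"
ibm_classical_seconds = 3 * 24 * 3600                    # "less than 3 days"

google_speedup = google_classical_seconds / quantum_seconds
ibm_speedup = ibm_classical_seconds / quantum_seconds

print(f"Google's claimed speed-up: ~{google_speedup:.1e}x")   # ~1.6e+09x
print(f"IBM's estimated speed-up:  ~{ibm_speedup:.0f}x")      # ~1296x
```

Even taking IBM's figure, the quantum machine is still more than a thousand times faster on this one benchmark; the dispute is over the headline factor of roughly a billion versus roughly a thousand.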
2019: Human memory: (October 31, 2019) (Main article here, which could not be archived in the WayBack machine, but see also here or here OR here or here for what are apparently other publications' articles about the same research.) A drug is showing promise for medical/psychiatric progress in helping people suffering from PTSD "forget" terrifying or psychologically disabling memories: a psycho-active drug administered to patients while they recall such memories has the effect of "weakening" them (an effect analogized to recalling particular computer files, editing them, and then "saving" them), with the result being the loss of the ability to recall such memories in ways that cause severe mental or emotional stress. Wrenn observation: Needless to say, as computer technology advances in interpreting human brainwaves in ways permitting algorithmic understanding of thinking and decision-making by the human brain, it's not difficult to expect that these two ostensibly unrelated scientific/medical advances may be married together.
2019: Privacy -- Medical Records -- Health Records -- (November 13, 2019) (here or here) Google Accessing and Controlling Patients' Health & Medical Records -- Project Nightingale -- Ascension -- HIPAA Issues -- Massive databases -- Individually Identifying Information. Wrenn comment: Since privacy is a sine qua non for attorney-client confidentiality and privilege, any weakening of privacy is potentially relevant to any legal principle governing data or information that rests on the existence of privacy, so we as lawyers should be continuously on alert for developments that potentially adversely affect privacy. Remember, privacy is the tree in the forest -- there's no confidentiality tree or privilege tree, because confidentiality and privilege are merely branches on the privacy tree.
2019: Relevant to "think the question and you'll be provided the answer" (November 20, 2019) (Source: here or here): Scientist uses MRI to "read" human thought; "One of the most surprising discoveries, says Dr. Just, has been the fact that activity patterns in the brain when people think about even abstract ideas like spirituality, forgiveness and gossip, are common across people. They're even the same when people think in different languages. To study emotions, Just asked acting students to conjure up different feelings while having their brains scanned. Again, results showed common patterns. "Each emotion had its own characteristic values and you could tell which one was which," he tells Stahl. "Amazingly, it was common across people." (See also entries for 2004 and then 2018 regarding "think the question.") Wrenn: This November 20, 2019 article focuses on use of an MRI to identify brain patterns in order to "read" the subject's mind, but as molecular-memory (organic-computer technology) soon comes of age, algorithms used by organic computers will do the same thing but in an inherently invasive way. Thus, when one "thinks" the question, the real question will become whether the question is the question of the one thinking or rather the question suggested by the organic interface between the "thinker" and the organic-interface source for the answer (that "will be provided"). This is a concept at war with the fundamental nature of individual autonomy a.k.a. "liberty."
2019: Tim Berners-Lee (Turing-Award-winning "inventor" of the "World Wide Web"[05] -- i.e., of "html") is proposing a "bill of rights" for the internet to enable individuals to protect their privacy (November 24, 2019; here or here). Wrenn notation: Unfortunately, its success depends upon big-data companies (whose success depends upon massively intrusive data-mining) voluntarily agreeing to his proposals for "governance" of the internet in ways that protect both freedom of expression and privacy. I fear his effort is akin to tilting at the big-data windmills.
2019: Wrenn note: With organic computers now on the verge of reality (computers biologically compatible with the human body), people wanting the power to have their "thoughts" translated into action (without having to hold a smart-phone and use fingers or voice to command actions -- such as turning on the lights at home while away, etc.) will rush like lemmings to have working biological-interface "implants" to acquire such "augmented-reality" powers -- such "God-like" powers. They will be "thinking" the "questions" and being "provided" the answers, but they'll never know whether the thoughts by which they decide which questions to "think" are their own, and they'll passively receive the "answers." So long to free will and liberty, unless we as a society devise a way to prevent such extinction of individual autonomy. The legal profession, with its duty to protect liberty writ large, has a huge responsibility to fulfill if such extinction of autonomy is to be prevented.
2020: Employers requiring ID badges in the workplace will soon be replaced by employers requiring employees to have a microchip (RFID chip) implanted in their bodies (e.g., in the employee's hand, arm, neck, etc.) (January 7, 2020; here or here). Information encoded in such chips may suffice alone in most employment contexts, but in contexts requiring higher security such information will comprise only part of the total "identification" function, which may also include external sensors using embedded biometric data (such as a particular person's gait, height, eye color, thumbprint, palm print, voice, etc.).
01I use "non-theistic" to describe people who are not theists but also not anti-theists, because they understand that anti-theism is an unscientific -- indeed, anti-scientific -- form of religious fanaticism. Far too many atheists are fanatically religious anti-theists incapable of recognizing the fundamentally religious nature of their "anti-theism." Thus, I describe myself as a "non-theist" to prove the truth represented by the acronym NAAAAH (coined by me), which stands for "Not All Atheists Are Asininely Hubristic," the last two words of which phrase may also be understood as somewhat euphemistic; as a matter of editorial etiquette, I choose not to provide a synonymic translation (even if there were such a thing as a Euphemisaurus -- i.e., a Thesaurus for Euphemisms).
02At the beginning of quantum theory, experiments designed to detect photons as particles (and thereby prove they're not waves) detected particles, while experiments designed to detect them as waves (and thereby prove they're not particles) detected waves. This led some quantum physicists to theorize that the mode of observation was what forced the photon to be a particle (or a wave) -- a notion that, because it implied no particular state of reality occurs in the quantum (sub-atomic) world until it is "observed," became known as "observer-created reality" and led fringe-theorists to assert that each observer creates his or her own universe by the manner in which he or she observes it. Einstein (who did not like quantum physics) famously replied, "I think the Moon is still there even when I'm not looking at it."
Current quantum physicists recognize that "particle" or "wave" is a state easily recognized in our "macro" world, but that at the quantum level they are different behaviors of what is the same thing -- i.e., recognition of the particle/wave duality of virtually every aspect of forces at the quantum level. So, our inability to be certain about the particular state of any particular aspect of a particular quantum feature of the reality we're trying to observe/measure/quantify/identify is the "uncertainty" principle (which destroys the predictability, or deterministic nature, of pre-quantum Newtonian physics akin to the predictability of billiard balls).
Since (a) quantum-level uncertainty negates Newtonian predictability/determinism (the latter of which would negate "free will") and (b) sub-atomic entities (such as electrons, photons, etc.) apparently can simultaneously be what we would classify in the macro-world as a particle and what we would classify in the macro-world as a "wave" or "field," some quantum physicists would argue that quantum physics proves "chaos" and thereby destroys the concept of "free will" by destroying the concept of complete predictability. But others could contend that whatever yet-undiscovered principle governs when and how a sub-atomic entity behaves as a particle rather than as a wave (or vice versa) is the foundational basis for what manifests itself in the macro-world as "choice" or "free will." As I said, this is "another argument for another day."
Also, what is probably the strongest physical-experiment evidence repudiating the "observer-created reality" theories is the 2012 discovery of the Higgs boson in the super-collider at CERN, which supports the existence of the Higgs "field" as an everywhere-present force constantly and unrelentingly "observing" every other force (i.e., a force that is "observing" the Moon for us while we're not "observing" it). So, the discovery of the Higgs boson (and the corresponding Higgs "field") would have made Einstein happy (and, perhaps, made Heisenberg and Schrödinger unhappy). And the "uncertainty" principle (i.e., the "wave/particle duality" of sub-atomic forces) appears to be vital to what is expected to enable "quantum" computers to eclipse current state-of-the-art computer technology (which must treat every data-point as a 1 or a 0) by use of "quantum software" enabling quantum computers to treat data-points as 1 and 0 simultaneously. Regarding quantum computing, see links in the time-line from 2017 through 2019.
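The "1 and 0 simultaneously" idea can be illustrated with a minimal numerical sketch (in Python with NumPy; the sketch is my own illustration, not drawn from any source linked in the time-line): a classical bit is definitely 0 or definitely 1, whereas a qubit can be prepared in a superposition carrying both amplitudes at once, with the squared amplitudes giving the measurement probabilities.

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a unit vector in C^2.
zero = np.array([1.0, 0.0])   # the state |0>
one = np.array([0.0, 1.0])    # the state |1>

# The Hadamard gate turns a definite bit into an equal superposition.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

qubit = H @ zero              # now (|0> + |1>) / sqrt(2)

# Born rule: measurement probabilities are the squared amplitudes.
p0, p1 = np.abs(qubit) ** 2   # each is approximately 0.5
```

Until the qubit is measured, both amplitudes participate in any further computation -- which is the sense in which a quantum computer can treat a data-point as 1 and 0 at the same time.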
03The legal profession is the only private-citizen profession ordained by the Constitution as vital to our form of government. The Sixth Amendment's explicit guarantee of the right to counsel is the beginning, not the end, of such ordination. Other provisions of the Constitution make declarations and/or proceed upon assumptions in reference to which the function of our profession is essential. How so? Two words: common law.
Common law comprised the vast bulk of criminal law for the "criminal prosecutions" of any "crime" for which the Sixth Amendment guaranteed the right to counsel. Common law comprised the bulk of the civil law incorporated by the Seventh Amendment. Both criminal and civil common law included procedures and rules of evidence for defining, requiring and applying "due process" standards to protect "life, liberty or property" and the Fifth Amendment privilege against self-incrimination.
Such references to civil and criminal "common law" meant not a Constitutional creation of a single body of "federal" common law but rather (i) the incorporation of common law as defined and applied by the courts of the former-colonies/now-states, and (in addition thereto) (ii) a form of federal common law03a by virtue of Article III, Section 1 of the Constitution vesting the "judicial power" (an inherent aspect of common law) of the United States in "one supreme Court, and in such inferior Courts as the Congress may from time to time ordain and establish."
Such procedures and rules of evidence were manifestations of common-law assertions and exercises of inherent judicial power by courts creating and regulating the legal profession (as officers of the courts) to implement the adversarial process for legal proceedings, recognizing attorney-client privilege and confidentiality as essential elements thereof. In essence, common law comprised not only the substantive criminal and civil law that had evolved by the case-law method but also the case-law method itself as the mechanism for the continuing evolution of common law, as well as the inherent judicial power03b of courts (the judicial branch) over the administration of justice and the licensing and governance of lawyers as "officers of the courts."
So, what is the import of all of the above? It's that the Constitution expressly and impliedly imposes upon our profession a vital role in the preservation of Liberty writ large under our system of government. That lofty and awesome responsibility calls for "professionalism" rather than merely professional conduct that satisfies legal-ethics and/or civil standards. It invokes a Constitutionally-inspired (if not Constitutionally-imposed) responsibility to advance and protect the independence of the judiciary and the vitality of our system for the administration of justice, and to conduct ourselves in such a way as to give "the People" confidence that the system is fair (though imperfect) and that we as a profession are continuously seeking to protect it from threats to its integrity and to remove as many imperfections as possible.
It's not for the purpose of denigrating any other profession that I say we must recognize that our profession has the most noble and vital goal of all the professions. Exhibiting this kind of professionalism is the best way to reinforce the truth spoken by the treacherous character Dick the Butcher in Shakespeare's Henry VI, Part 2, in telling fellow plotters that, to impose chaos and then tyranny, "The first thing we do, let's kill all the lawyers."03c Of course, as lawyers, we all know how ubiquitously the quoted language alone appears in popular culture to convey the opposite view of lawyers -- purporting to quote Shakespeare as evidence that he had the "wisdom" to recommend killing all the lawyers.
But all the good that such professionalism can accomplish can easily be undone by arrogance in the manner in which we practice it. Why? Such arrogance breeds contempt in the minds of the "People" and thereby engenders impulses to demand legislative and/or executive actions to curb what they may perceive as "excesses" of the "un-elected" judiciary. So, part of effective professionalism is to muster a degree of humility in ourselves, to seek to inspire it in others in our profession and among functionaries in our judiciaries, and to educationally persuade (not sanctimoniously lecture) the "People" that perpetuation of common law (including independence of the judiciary and its "inherent judicial power"03d to utilize and govern the legal profession) comprised a vital tool by which the Constitution permanently embedded and protected Liberty in our system of government.
Thus, a proper appreciation of Constitutionalism is to recognize that it places the legal profession at center-stage with a duty to practice professionalism for the preservation of Liberty writ large.
03a. In American Electric Power v. Connecticut, 564 U.S. 410 (2011), the U.S. Supreme Court described its mid-to-late 20th Century repudiation of its flawed early-20th Century decision in Erie R. Co. v. Tompkins, 304 U. S. 64, 78 (1938) incorrectly denying the existence of federal common law with respect to federal (i.e., national) issues.
03b. Inherent Judicial Power -- Since the beginning of the formal recognition/creation of the legal profession, courts have asserted and exercised "inherent judicial power" to regulate the conduct of attorneys. The common-law elements of American jurisprudence, as well as U.S. Constitutional jurisprudence applying the separation-of-powers doctrine, have almost* always recognized such inherent judicial power. Enforcement of such standards is by disciplinary proceedings, which are civil proceedings deemed "sui generis" in nature as a manifestation of the judiciary's inherent power to regulate the conduct of attorneys. The first occasion for the U.S. Supreme Court to recognize such inherent judicial power was Ex Parte Secombe, 60 U.S. (19 How.) 9, 15 L.Ed. 565 (1856); the next was Ex Parte Wall, 107 U.S. 265, 2 S.Ct. 569, 27 L.Ed. 552 (1882); then Lathrop v. Donohue, 367 U.S. 820, 81 S.Ct. 1826 (1961). However, a recent U.S. Supreme Court action in Fleck v. Wetch, 586 U.S. ___ (Dec. 3, 2018) (17-886), vacating an 8th Circuit decision (on inherent judicial power) and remanding it for reconsideration in light of the Supreme Court's then-recent decision in Janus v. American Federation of State, County, and Municipal Employees, 585 U.S. ___ (2018), makes it a virtual certainty that issues pertaining to the applicability of first-amendment rights to the scope of inherent judicial power will be back before the U.S. Supreme Court within the next several years. The link for the thereby-vacated 8th Circuit decision in Fleck is here. [*Re "'almost' always" (above), see the "But see" in footnote 03e infra re Goldfarb v. Virginia State Bar.]
03c. https://web.archive.org/web/20180830123359/http://www.shakespeareforalltime.com/tag/kill-the-lawyers/ .
03d. Inherent Judicial Power -- see footnote "03b" above.
03e. Re cross-reference at end of Footnote 03b, above: But see Goldfarb v. Virginia State Bar, 421 U.S. 773, 95 S.Ct. 2004, 44 L.Ed.2d 572 (1975). Goldfarb is one of the rare cases in which the U.S. Supreme Court inexplicably (in my opinion) ignored this principle of inherent judicial power by failing to recognize lawyers' compliance with state Supreme Court Rules regulating their conduct as being exempt from the legislatively regulatory grasp of federal anti-trust laws regarding schedules of customary fees in particular localities. In my opinion, Goldfarb cannot be understood as legal doctrine; rather, to be understood, it must be recognized as political doctrine. Hopefully, someday a case will arise to set Goldfarb adrift on the stagnant waters of overruled (or distinguished-out-of-existence) precedent.
04Quantum Entanglement Exceeding Speed of Light. Imagine that our three-dimensional universe were to be smashed-flat into "flatland" (a hypothetical two-dimensional universe) within which nothing can travel faster than light. On Earth, we apply quantum entanglement to every atom of all materials to be used to construct two identical rockets capable of traveling at the speed of light and then erect them side-by-side on two launch pads. Assume the color of each rocket represents the current state of its quantum entanglement. At launch, the "entanglement" of each Rocket is Blue. We launch one to the Moon. It arrives 1.4 seconds later. The box below represents a third "quantum" dimension in which we can use our quantum-dimension cursor to touch anywhere within the quantum box. The words inside the box represent the relative location and color (i.e., quantum state) of the two rockets. When we use our third-dimension cursor (as a "quantum-state-changer") to only touch "RocketOnEarth" (or only touch RocketOnMoon), both rockets' color changes instantaneously.
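The two-rocket analogy can also be sketched numerically. The following is a minimal illustration (in Python with NumPy; it is my own hypothetical sketch, not part of the original page's interactive box) of the perfect correlation entanglement produces: simulated joint measurements of a Bell state always agree, just as the two rockets always share one color.

```python
import numpy as np

rng = np.random.default_rng(42)

# Bell state (|00> + |11>) / sqrt(2): two qubits sharing a single
# joint state, like the two rockets sharing one "color."
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2   # probabilities of outcomes 00, 01, 10, 11

# Simulate 1000 joint measurements of the entangled pair.
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)

# The two measurements always agree -- "01" and "10" never occur.
agree = all(o[0] == o[1] for o in outcomes)
```

Note that this sketch reproduces only the correlation; standard quantum mechanics holds that such correlations cannot by themselves be used to transmit information faster than light.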
05 Although the connotation of "World Wide Web" is that it's synonymous with the "internet," the denotations of the two terms differ. The "internet" is the physical structure, and the commands for it, to handle the transmission and receipt of electronic information; the "World Wide Web" is the hypertext markup language (html) designed by Tim Berners-Lee to enable broader and more sophisticated use of the "internet." For an explanation of the relationship between the "internet" and the "World Wide Web," go here: http://www.w3.org/People/Berners-Lee/FAQ.html#InternetWeb. In 1989, Tim Berners-Lee and Robert Cailliau (scientists at CERN in Switzerland) began designing a proposal for ways to enhance the functionality of such network. Building on hypertext concepts famously demonstrated by Doug Engelbart (inventor of the "mouse"), they completed the project by designing "HyperText Markup Language" (html) as markup code enhancing the functionality of the internet by improving the ways computers could assemble, organize and display data and enable human interaction therewith. When they completed this project in 1990, they named it the "World Wide Web" (or "WWW" or "www"). (More at http://www.w3.org/Consortium/history.html.) (See also in the chronology at "1980s" a description of the "Birth of the Internet.")