Intelligence and the Human Brain
What is human intelligence? How do you quantify it? These questions need to be addressed before discussing the connection between intelligence and the human brain.
So, first the definition. We all have our own ideas about what intelligence is. To keep things simple, we use the Encyclopaedia Britannica definition: “human intelligence is the mental quality that consists of the abilities to learn from experience, adapt to new situations, understand and handle abstract concepts, and use knowledge to manipulate one’s environment.”
Next, how do you quantify intelligence? Commonly, a combination of standardized tests is used to measure the abilities listed above and more. The results yield a number—the Intelligence Quotient (IQ). This number is what many people are familiar with as a measure of intelligence. However, IQ test results are somewhat influenced by social and cultural factors. Therefore, many researchers also use a measurement called the g-factor (general factor of intelligence). Measurements of additional abilities, such as reasoning, memory, vocabulary, spatial ability, and processing speed, go into calculating the g-factor. Studies have shown that the g-factor is strongly influenced by heredity (biological and genetic factors) but less affected by the environmental factors that influence IQ. Nevertheless, IQ has been shown to be a fair approximation of the g-factor, so brain research often involves either or both.
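For readers curious about the mechanics, the g-factor is extracted statistically as the common factor underlying a battery of test scores. The following minimal sketch uses synthetic data and the first principal component as a rough stand-in for a full factor analysis; it is only an illustration of the idea, not any particular researcher's method, and the test names, loadings, and scores are invented.

```python
# Minimal illustration: approximating a general factor ("g") as the first
# principal component of standardized scores on several cognitive tests.
# The test names and data here are made up for demonstration only.
import numpy as np

rng = np.random.default_rng(0)

# Simulate 500 people: a latent "g" plus test-specific noise drives each score.
n_people = 500
g_latent = rng.normal(size=n_people)
tests = ["reasoning", "memory", "vocabulary", "spatial", "speed"]
loadings = np.array([0.8, 0.7, 0.6, 0.7, 0.5])   # how strongly each test reflects g
scores = np.outer(g_latent, loadings) + rng.normal(scale=0.6, size=(n_people, len(tests)))

# Standardize each test, then take the first principal component as a g estimate.
z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
cov = np.cov(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)            # eigh returns ascending order
first_pc = eigvecs[:, -1]
g_estimate = z @ first_pc

# The estimate should correlate strongly with the latent factor we built in.
print("correlation with latent g:", round(abs(np.corrcoef(g_estimate, g_latent)[0, 1]), 2))
```

In practice, psychometricians use full factor-analytic models and real test batteries; the point here is simply that g emerges as the variance shared across diverse tests.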
Now, how are intelligence and the human brain related? Using modern brain imaging techniques such as fMRI (Functional Magnetic Resonance Imaging) and PET (Positron Emission Tomography) scans to study brain activity, coupled with IQ and g-factor measurements, researchers are discovering brain characteristics that correlate with intelligence. These characteristics include the amount and distribution of grey matter and differences in neural networking. (1) As one study shows, in individuals with higher intelligence “the areas of the brain which are associated with learning and development show high levels of variability, meaning that they change their neural connections with other parts of the brain more frequently, over a matter of minutes or seconds.” The study goes on to say, “the more variable a brain is, and the more its different parts frequently connect with each other, the higher a person’s IQ and creativity are.” (2)
There are other interesting findings. For example, when groups with different g-factors solve the same problem, there is much higher brain activity in the people with lower g-factors than in those with higher g-factors. The interpretation is that less intelligent people require much more brain activity to arrive at the solution. It was also found that, when a group of men was compared with a group of women having the same IQ and g-factor, the men showed activity in completely different areas of the brain than the women while solving the same problem. This finding provides a clue on how to restore brain functions in people with brain injuries (i.e., by somehow redirecting brain activity through uninjured parts of the brain). (3)
Although the above examples of studies using brain scans are promising, many more years of brain research will likely be required, first to obtain a much fuller understanding of how the different parts of the brain work together, and then to learn how to put that understanding to use. If you are interested, Reference 3 provides a good, easy-to-read overview of advances in this field. And for more technical articles on brain networking and intelligence, see References 4-7.
Why is all this important? Ultimately, a complete revolution in our way of life could be unleashed by the ability to manipulate brain functions—to repair brain injuries, cure/prevent mental illnesses, and even to make humans more intelligent. One more door, waiting to be opened, with an unknown future on the other side.
1. Roberto Colom, Rex Jung, and Richard Haier, “Distributed brain sites for the g-factor of intelligence,” NeuroImage, 31 (2006) 1359-1365, https://static1.squarespace.com/static/538634aee4b0b15c0516a524/t/538774afe4b07a163543ab01/1401386159041/distributed-brain-sites-for-the-g-factor-of-intelligence.pdf
2. University of Warwick, “Human intelligence measured in the brain,” ScienceDaily, July 18, 2016, https://www.sciencedaily.com/releases/2016/07/160718110938.htm
3. Richard Haier and Rex Jung, “Brain Imaging Studies of Intelligence and Creativity: What is the Picture for Education?” Roeper Review, 30 (2008) 171-180, https://podcasts.shelbyed.k12.al.us/sspears/files/2015/01/Brain-Imaging-Studies-of-Intelligence-and-Creativity-What-is-the-picture-of-Education.pdf
4. Michael Ferguson, Jeffrey Anderson, and R. Nathan Spreng, “Fluid and flexible minds: Intelligence reflects synchrony in the brain’s intrinsic network architecture,” Network Neuroscience, 1 (June 2017), no. 2, 192-207, https://www.mitpressjournals.org/doi/full/10.1162/netn_a_00010
5. Kirsten Hilger, Matthias Ekman, Christian Fiebach, and Ulrike Basten, “Intelligence is associated with the modular structure of intrinsic brain networks,” Scientific Reports, 7 (November 2017), Article no. 16088, https://www.nature.com/articles/s41598-017-15795-7
6. Youngwoo Yoon et al., “Brain Structural Networks Associated with Intelligence and Visuomotor Ability,” Scientific Reports, 7 (2017), Article no. 2177, https://www.nature.com/articles/s41598-017-02304-z
7. Aron Barbey, “Network Neuroscience Theory of Human Intelligence,” Trends in Cognitive Sciences, 22 (January 2018), no. 1, 8-20, https://www.sciencedirect.com/science/article/pii/S1364661317302218
Human Brain Research: Global Initiatives – An Update
In previous posts we have introduced the topic of brain research, attempted to explain its importance, summarized global initiatives focused on brain research, and described some of the new tools and technologies being used in brain research. In this post, we provide an update on the progress (or lack thereof) being made as a result of the global initiatives.
First, we focus on the “US BRAIN Initiative” that was launched in the spring of 2013. Since then, Congress has appropriated significant and increasing levels of funding each year for this initiative; for 2018 this amounts to $400 million. The National Institutes of Health (NIH), working in partnership with government agencies, universities, foundations, and industry, uses this funding to award research grants in seven specific aspects of brain research. Information about funding, the alliances, and summaries of past and current grants can be found on the NIH Web site https://www.braininitiative.nih.gov/. It appears that the US BRAIN Initiative is well funded, active, and starting to produce results.
Next, we turn to the European Union’s effort, also launched in 2013 – the “Human Brain Project” (HBP). Here, the news isn’t as positive, as the title of a 2015 article in Scientific American indicates: “Why the Human Brain Project Went Wrong—and How to Fix It. Two years in, a $1-billion-plus effort to simulate the human brain is in disarray…” (1) In a nutshell, the EU awarded $1.3 billion to one neuroscientist as the project leader for one big project – his. And things quickly fell apart. This led to a radical overhaul of the management and project structure. As an IEEE Spectrum article states, “The massive €1 billion project has shifted focus from simulation to informatics.” (2) The article goes on to explain: “After a rocky, controversial start, the HBP is now building infrastructure that includes high-performance computing, data analytics, and simulation and modeling software.” But are things better? It’s hard to tell. However, a couple of things are clear: there is significant money available, and there are a number of active research projects. Visit the Web site yourself and decide: https://www.humanbrainproject.eu/en/.
Now, an update on the smaller Japanese effort – the “Brain/MINDS Project,” initiated in 2014. A detailed description and interim update published in 2016 outlines the project’s structure, objectives, research projects, and actual funding ($365 million spread over 10 years). (3) More information can be found on the Project’s Web site: http://brainminds.jp/en/. From all indications, the project has been active since 2014 and is producing results.
Finally, we turn to China and its “China Brain Project” (announced in mid-2016). Detailed information on this “project” is difficult to find, but there are at least two specific actions:
- In the summer of 2017, China announced the opening of the HUST-Suzhou Institute for Brainsmatics in Suzhou, China. The Institute has a 5-year budget of $67 million, plans to hire around 120 scientists and technicians, and aims to “make industrial-scale high-resolution brain mapping a standard tool for neuroscience.” (4) The Allen Institute for Brain Science, the Cold Spring Harbor Laboratory in New York, and Stanford University in California have formed partnerships with this new center.
- In March of this year, the Chinese Institute for Brain Research in Beijing was officially established. Around 50 researchers will have laboratories at the new center, and external grants will support around 100 investigators throughout China. The Center will be a partnership between Beijing’s premier biomedical institutions, among them the Chinese Academy of Sciences, the Academy of Military Medical Sciences, Peking University and Tsinghua University. (5)
In addition, other programs and centers around China are being created. Funding appears to be available for these multiple efforts and centers, but finding enough researchers is likely to be a challenge. However, if China is successful in meeting this challenge, it may establish a clear leadership position in this technology area.
So, is understanding the human brain a race or a global partnership? Only time will tell. Your thoughts?
1. Stefan Theil, “Why the Human Brain Project Went Wrong—and How to Fix It,” Scientific American, October 1, 2015, https://www.scientificamerican.com/article/why-the-human-brain-project-went-wrong-and-how-to-fix-it/
2. Megan Scudellari, “The Human Brain Project Reboots: A Search Engine for the Brain Is in Sight,” IEEE Spectrum, June 21, 2017, https://spectrum.ieee.org/computing/hardware/the-human-brain-project-reboots-a-search-engine-for-the-brain-is-in-sight
3. Hideyuki Okano et al., “Brain/MINDS: A Japanese National Brain Project for Marmoset Neuroscience,” Neuron, 92, November 2, 2016, https://www.cell.com/neuron/pdf/S0896-6273(16)30719-X.pdf
4. David Cyranoski, “China launches brain-imaging factory,” Nature, August 17, 2017, https://www.nature.com/news/china-launches-brain-imaging-factory-1.22456
5. David Cyranoski, “Beijing launches pioneering brain-science centre: China’s much-anticipated brain initiative finally starts to take shape,” Nature, April 5, 2018, https://www.nature.com/articles/d41586-018-04122-3
Human Brain Research – The Developing Tools
Innovation. Technology breakthroughs. Interdisciplinary efforts. All of this is providing the opportunity for more scientific and comprehensive brain research. More specifically, the convergence of breakthroughs in biogenetics, nanotechnology, and neuroscience, coupled with advanced microelectronics and data processing, has led to new tools and devices for brain research and understanding. We highlight a few of these to show the possibilities.
First, there are the advanced imaging technologies that have led to new techniques and instrumentation already in use. Short summaries of the most common are provided in a post on psychcentral.com. (1) These include:
- PET (Positron Emission Tomography). PET uses small amounts of radioactive materials injected into the body, a special camera, and a computer to evaluate organ and tissue functions. By identifying changes at the cellular level, PET appears to be able to detect which parts of the brain are affected during specific tasks.
- Variations of Magnetic Resonance Imaging (MRI) such as Functional MRI (fMRI) and Diffusion MRI (also called Diffusion Tensor Imaging – DTI). With fMRI, the small changes in blood flow that occur with brain activity are measured and mapped. Thus, it is possible to determine which parts of the brain are handling critical functions or to evaluate the effects of stroke or other disease. With DTI, the diffusion of water molecules in the brain is measured. Since water molecules within brain tissue tend to diffuse most rapidly along parallel bundles of fibers, this makes it possible to estimate the location, orientation, and anisotropy of the brain’s white matter tracts (a simple sketch of how anisotropy is calculated follows this list). In other words, it is possible to measure the pathways and structure of the fiber nerve bundles connecting various parts of the brain. This understanding of which part of the brain is connected (or not connected) to which other parts can be used to investigate brain “malfunctions” due to injury or disease.
- Magnetoencephalography (MEG). Instead of measuring electrical impulses, MEG measures the magnetic fields outside the head produced by electrical activity occurring naturally in the brain. Thus, it is possible to produce far more precise and higher resolution images of the brain than before and even to determine the function of various parts of the brain. To do this, very sensitive arrays of magnetometers called SQUIDs (superconducting quantum interference devices developed by quantum physicists) are used. Typically, these sensors are housed in a cooled, helmet-shaped container that the subject places over their head during testing.
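To make the DTI anisotropy measure mentioned above more concrete, here is a minimal sketch of how fractional anisotropy (FA) is computed from a diffusion tensor's eigenvalues. The example eigenvalues are illustrative numbers, not measured data, and real pipelines first estimate the tensor at each voxel from many diffusion-weighted images.

```python
# Minimal sketch of the anisotropy measure used in diffusion MRI (DTI).
# Given the three eigenvalues of a voxel's diffusion tensor, fractional
# anisotropy (FA) is 0 for perfectly isotropic diffusion and approaches 1
# when diffusion is confined to one direction (e.g., along a fiber bundle).
# The example tensors below are illustrative values, not measured data.
import numpy as np

def fractional_anisotropy(evals):
    """FA from the eigenvalues (principal diffusivities) of a diffusion tensor."""
    l1, l2, l3 = evals
    num = (l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return np.sqrt(0.5 * num / den)

# Roughly isotropic diffusion (grey-matter-like) vs. strongly directional
# diffusion (white-matter-fiber-like), in arbitrary units.
print("isotropic voxel:  FA =", round(fractional_anisotropy([1.0, 0.95, 0.9]), 2))
print("fiber-like voxel: FA =", round(fractional_anisotropy([1.7, 0.3, 0.2]), 2))
```

High FA marks voxels where water diffuses mainly along one direction, which is exactly the signature of the white matter fiber bundles described in the DTI bullet.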
To summarize, the above tools allow researchers to identify the parts of the brain that are active during a specific task or event by showing on a screen the parts of the brain that “light up” under different circumstances. Why is this important? Contrary to earlier beliefs, it has now been observed that even relatively simple tasks require the activation of numerous and specific interconnected parts of the brain. Therefore, understanding brain connections and interactions is much more important in addressing brain issues such as injury or dementia than was previously thought.
But these imaging techniques are only a start. Following are a few examples of developing, longer-range possibilities.
- In one example, real-time imaging of interactions at the cellular level, coupled with advanced data processing, is being used to reveal patterns of neural activity. Specifically, “Scientists have devised a new system that lets them watch human neurons grown in the lab find and form connections with their signaling partners, an essential process in developing human brains. The process of ‘wiring up’ is thought to go awry in a number of serious disorders, including autism, epilepsy and schizophrenia – but it’s hard to study.” (2)
- And there is another experimental approach to creating brain wiring diagrams that combines genetic engineering and nanoscale imaging. This technique monitors biofluorescence in insect brains to create maps of the neural connections of the entire brain. In other words, “Scientists have developed new technology that allows them to see which neurons are talking to which other neurons in live, genetically engineered fruit flies.” This technology, which traces the flow of information across synapses, is called TRACT (Transneuronal Control of Transcription). “TRACT allows researchers to observe which neurons are ‘talking’ and which neurons are ‘listening’ by prompting the connected neurons to produce glowing proteins.” (3)
- And then there is the gene editing technology called CRISPR. This technique has been used to create genetic mutations that have been associated with neurodevelopmental disorders, making it possible to study these “defects” in the laboratory. (4)
- One final example. There is a new, high-sensitivity, laser-based technique that can be used to look inside a person’s skull and measure brain blood flow. This technique, based on Diffuse Correlation Spectroscopy (DCS), is called “interferometric diffusing wave spectroscopy,” or iDWS. “Laser light is shined on the head; as photons from the laser pass through the skull and brain, they are scattered by blood and tissue. A detector placed elsewhere on the head, where the photons make their way out again, picks up the light fluctuations due to blood motion.” (5) The information gathered about blood flow can be used to help patients with traumatic brain injuries and strokes.
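To give a feel for how “light fluctuations due to blood motion” become a blood-flow number, the sketch below computes the normalized intensity autocorrelation of a simulated detector signal: faster flow means faster decorrelation. This is a toy illustration of the general DCS idea, not the iDWS instrument or its algorithm, and the simulated signal and decay constants are invented for the example.

```python
# Toy illustration of the correlation analysis behind diffuse correlation
# spectroscopy (DCS): blood flow is inferred from how quickly the detected
# light intensity decorrelates. Signals and constants are made up.
import numpy as np

def normalized_autocorrelation(intensity, max_lag):
    """g2(tau) = <I(t) I(t+tau)> / <I(t)>^2 for lags 0..max_lag-1."""
    intensity = np.asarray(intensity, dtype=float)
    mean_sq = intensity.mean() ** 2
    return np.array([
        np.mean(intensity[: len(intensity) - lag] * intensity[lag:]) / mean_sq
        for lag in range(max_lag)
    ])

rng = np.random.default_rng(1)

def simulated_intensity(decay_time, n=20000):
    # Speckle-like fluctuations: low-pass-filtered noise whose correlation
    # time stands in for the blood-flow-dependent decorrelation time.
    noise = rng.normal(size=n)
    alpha = np.exp(-1.0 / decay_time)
    field = np.empty(n)
    field[0] = noise[0]
    for i in range(1, n):
        field[i] = alpha * field[i - 1] + np.sqrt(1 - alpha**2) * noise[i]
    return 1.0 + 0.5 * field**2          # intensity ~ |field|^2 plus a baseline

for label, tau_c in [("slow flow", 50), ("fast flow", 10)]:
    g2 = normalized_autocorrelation(simulated_intensity(tau_c), max_lag=200)
    half_decay = np.argmax(g2 < 1 + 0.5 * (g2[0] - 1))  # lag where g2 falls halfway to 1
    print(f"{label}: g2 decays to half-contrast after ~{half_decay} samples")
```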
As the above examples show, progress is being made rapidly in developing new tools for brain research and understanding. But all of this is just a start. In future blogs we will give additional examples of new techniques, how they are being utilized, and even some results. You are welcome to comment or add to our list.
1. Michael Demitri, “Types of Brain Imaging Techniques,” July 17, 2016, https://psychcentral.com/lib/types-of-brain-imaging-techniques/
2. Sergiu P. Pasca, “New Technique Lets Researchers Watch Human Brain Circuits Begin to Wire-Up,” July 18, 2017, https://www.bbrfoundation.org/content/new-technique-lets-researchers-watch-human-brain-circuits-begin-wire
3. “New technology will create brain wiring diagrams,” California Institute of Technology, January 12, 2018, https://www.sciencedaily.com/releases/2018/01/180112095938.htm
4. Michael Talkowski, “Genetic Anomalies Frequently Associated with Neurodevelopmental Disorders Can Now Be Efficiently Recreated in the Lab,” April 11, 2016, https://www.bbrfoundation.org/content/genetic-anomalies-frequently-associated-neurodevelopmental-disorders-can-now-be-efficiently
5. “New technology for measuring brain blood flow with light,” University of California – Davis, April 11, 2018, https://www.sciencedaily.com/releases/2018/04/180427144549.htm
The Business Challenges of Globalization
Taken from Creating New Superstars by Carol and Ennio Fatuzzo (1)
In spite of the chaotic world around us, the risks involved in playing some types of games haven’t changed. For example, playing roulette, whether it is Russian roulette or the more civilized version in Monaco, is the same as it always has been. However, in today’s fast paced environment, the “game” of business has become a much more dangerous venture. Developing a new business or expanding an existing one involves a whole new dimension of risk due to many developing “agents of change.” Globalization, including the rapidly expanding global economy and the consequences of global competition, is one of the most powerful influences.
On a positive note, globalization significantly increases potential market sizes, creating extremely attractive and visible business growth opportunities. Just think about the worldwide explosion of smartphones, or the rapid expansion of wine import and export businesses, or the huge potential for new cancer drugs. Even water is now a global opportunity, as the recent history of San Pellegrino shows. Twenty years ago it was a relatively unknown Italian mineral water. Today, it is distributed worldwide to more than 120 countries on five continents. (2) But with size come different challenges.
Highly visible, big growth opportunities create new global competitors that were never before threats―Korean car manufacturers, Indian software developers, Chinese computer and internet-based companies, and more. The bottom line: More companies around the world are likely to be pursuing the same specific growth opportunity at the same time. Therefore, the risk of failure for any single company is high—significantly higher than in the past.
Looking at the situation another way, in the past a company had a reasonable possibility of being the only one pursuing a good new opportunity—one that was unrecognized by others. And that resulted in many single-company big successes: Kodak and silver halide film, IBM and computers, Motorola and cell phones, RCA and consumer electronics, and more. But in today’s dynamic global economy, due to the growing technical sophistication of global competitors and the faster pace of everything, there will not be many “lone pioneers.”
And there is another kind of challenge. Pursuing larger global opportunities requires greater resources than what are needed to be successful with smaller, “local” opportunities. This results in the financial risk being much higher, sometimes high enough to place an entire company in jeopardy. If a global project fails, for whatever reason, that failure is extremely costly. Kodak having to declare Chapter 11 bankruptcy as a result of its late and unsuccessful attempt to become a major global player in digital photography is a good example of today’s high cost of failure. (3)
But the risk of large financial investments isn’t the only challenge involving resources. In the past, under-resourcing a project or using resources ineffectively did not matter as much as it does today. Why? Any such “mistakes” will slow down progress; and in a faster paced and more competitive business world, this decreased speed will almost certainly create a significant competitive disadvantage. This, in turn, greatly increases the probability of a costly failure in the marketplace.
And finally, because of today’s need for speed, failure is not only connected to making wrong or bad decisions. It frequently is the result of making good decisions too slowly. Kodak’s eventual management decision to pursue digital photography was a good one, but the delay in making that decision was a major contributor to the effort’s failure. This delay gave global competitors an insurmountable lead.
Bottom line, in a highly competitive, global business environment that is rapidly changing, a slow decision will almost always be a wrong decision; and being late to the market almost always assures failure. Just as in nature, a slow company will become prey for the faster, more aggressive one.
1. Ennio Fatuzzo and Carol L. Fatuzzo, Creating New Superstars: a Guide to Businesses that Soar above the Sea of Normality (USA: September 2016). Available from amazon.com.
2. S. Pellegrino Company Website, accessed October 18, 2017, https://www.sanpellegrino.com/us/en/company-intl-41.
3. Rick Newman, “Four Lessons from Kodak’s Comedown,” U.S. News online, January 19, 2012, http://www.usnews.com/news/blogs/rick-newman/2012/01/19/4-lessons-from-kodaks-comedown; “The last Kodak moment?,” The Economist, January 13, 2012, accessed online October 18, 2017, http://www.economist.com/node/21542796.
Big Data and YOU: The Promises and the Concerns
by Carol L. Fatuzzo and Ennio Fatuzzo
BACKGROUND
In our book “Creating New Superstars” (1) and in a previous blog, we focused on Big Data and Business. Now we take a brief look at the more personal side of Big Data: the reality, the promises, and the concerns.
As you must be aware, today vast quantities of data about people (including you) and their interactions with the outside world are being accumulated at unprecedented rates and stored in digital form. This rapidly increasing, already huge, storehouse of personal information is part of what is known as “Big Data.”
Where is this personal information coming from? Everything we do online, such as shopping and banking, leaves a record. But there are a growing number of other sources: social media, Google, smartphones and other smart devices, electronic medical records, military and government databases, surveillance cameras, and much more.
Collecting and storing personal Big Data digitally has become easy and is pervasive, but it is only the beginning. For this vast amount of information to be useful, there must be the ability to access the data rapidly and reliably; and there must be tools that can quickly analyze an immense amount of seemingly unrelated information, and make useful connections. And all of this is now reality. Faster and more powerful computers coupled with software advances (e.g., “artificial intelligence”) are rapidly opening doors to new analytic capabilities.
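As a toy illustration of what “making useful connections” can look like in practice, the snippet below joins two small, made-up personal data sources and summarizes the result. Every table, column name, and value is invented for the example; real systems do this at vastly larger scale and with far more sources.

```python
# A toy illustration of "making useful connections": joining two
# unrelated-looking personal data sources and summarizing the result.
# The tables, column names, and values are entirely made up.
import pandas as pd

purchases = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 3, 3],
    "category": ["coffee", "running shoes", "coffee", "vitamins", "coffee", "yoga mat"],
    "amount": [4.5, 120.0, 5.0, 15.0, 4.0, 30.0],
})

fitness = pd.DataFrame({
    "user_id": [1, 2, 3],
    "avg_daily_steps": [11000, 3500, 9000],
})

# Link the two sources on user_id, then profile spending by activity level.
merged = purchases.merge(fitness, on="user_id")
merged["active"] = merged["avg_daily_steps"] > 8000
profile = merged.groupby("active")["amount"].agg(["count", "sum"])
print(profile)   # e.g., how much "active" vs. "inactive" users spend, and on what
```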
USES OF PERSONAL BIG DATA
Many large companies and organizations already have access to the growing collection of personal Big Data and are taking advantage of the advanced analytic capabilities. Common examples are targeted marketing and credit checks. And this is only the beginning. There are many less obvious ways personal Big Data is starting to impact your everyday life, including tracking your physical activities and location and even determining choices offered to you in bars and restaurants (2, 3).
Another growing use of Big Data is in sports. Not only individual players’ moves but entire games can be analyzed to improve player performance and game strategy. And then there are the fans. Analyzing fan-generated Big Data is leading to techniques for building stronger fan support and providing extra (and more profitable) event-based services. (4)
Then there is the healthcare segment. Here, collection and analysis of personal Big Data is already leading to major advances: improvements in healthcare outcomes (including saving lives), remote patient monitoring and real-time alerting, more cost-effective treatments, programs to prevent opioid abuse, accelerated cancer research, access to the latest treatments being tested, and much more. (5,6)
What we have described so far is only the beginning. To repeat, sources of personal Big Data are exploding (GPS tracking, wellness monitoring, surveillance of financial transactions, facial recognition, education records….) as are the capabilities for sophisticated analysis and uses for this data. And yes, the collection and use of personal Big Data has many positives. There is no question that the future benefits arising from the combination of big data and advanced analytics will be immense.
THE CONCERNS
But there is a downside. As summarized by McKinsey and Company: “Privacy issues will continue to be a major concern. Although new computer programs can readily remove names and other personal information from records being transported into large databases, stakeholders across the industry must be vigilant and watch for potential problems as more information becomes public.” (5)
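The “computer programs [that] can readily remove names and other personal information” mentioned in the quote boil down to de-identification steps like the minimal sketch below. The field names, records, and salting scheme are ours for illustration only; real de-identification standards (e.g., HIPAA Safe Harbor or GDPR-style pseudonymization) go well beyond this.

```python
# Minimal sketch of de-identification: drop direct identifiers and replace
# the record key with a salted hash before data are shared for analysis.
# Field names and records are made up; real de-identification involves
# much more than this (quasi-identifiers, re-identification risk, etc.).
import hashlib

SALT = "replace-with-a-secret-salt"   # hypothetical secret kept by the data owner

def deidentify(record, direct_identifiers=("name", "phone", "address")):
    clean = {k: v for k, v in record.items() if k not in direct_identifiers}
    clean["pseudo_id"] = hashlib.sha256((SALT + record["name"]).encode()).hexdigest()[:12]
    return clean

patients = [
    {"name": "Jane Doe", "phone": "555-0100", "address": "1 Main St",
     "age": 54, "diagnosis": "hypertension"},
    {"name": "John Roe", "phone": "555-0199", "address": "2 Oak Ave",
     "age": 61, "diagnosis": "type 2 diabetes"},
]

for p in patients:
    print(deidentify(p))   # identifiers gone, analysis fields and a pseudonym remain
```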
But there is an even more serious concern. So far, we have focused on the use of personal Big Data by businesses and other private or public organizations. It is an entirely different situation when governments enter the arena. A number of articles have raised the concern about Big Data in the hands of government evolving into “Big Brother.” Below, we repeat one example from our January blog “Big Data: An Exploding Agent of Change.”
Recent articles have focused on a data collection and analysis project being run by the Chinese communist party to develop what they call a “social-credit system.” (7, 8) To summarize, using Big Data technologies, the project’s objective is to develop a system to collect and categorize as “good” or “bad” all available information for each individual citizen. Ultimately, rewards for good behavior (e.g., prizes, better housing) and punishments for bad behavior (e.g., denial of permissions to travel or access to loans and services) would be handed out—all this aimed at improving the allegiance of citizens to the State.
Will China be successful? How far will other governments go towards using Big Data to become “Big Brother” watching over each citizen? Certainly, these are valid concerns. And for those who watch the television series “Person of Interest,” it may occur to them that the project described above is much more dangerous than the situation portrayed by the TV series. The latter only monitors each person in real time, but the Chinese scenario not only does this but also builds a history of everything each citizen has done and uses that information for its own purposes.
Yes, the growing availability and use of personal Big Data presents serious concerns. However, keep in mind that every breakthrough new technology has the potential for both good and bad. It all depends on the intentions of those who develop and apply the technology.
REFERENCES
1. Ennio Fatuzzo and Carol L. Fatuzzo, Creating New Superstars: A Guide to Businesses that Soar above the Sea of Normality (USA: September 2016). Available for purchase from amazon: http://amzn.to/2hAn6dy.
2. “Big Data in Our Everyday Life,” February 10, 2017, Nordic-IT, https://nordic-it.com/big-data-everyday-life/
3. Mona Lebied, “5 Big Data Examples in Your Real Life At Bars, Restaurants, and Casinos,” Business Intelligence, May 4th 2017, http://www.datapine.com/blog/big-data-examples-in-real-life/
4. “Big Data in Sports: Going for the Gold,” inside BIGDATA, June 4, 2017, https://insidebigdata.com/2017/06/04/big-data-sports-going-gold/
5. Basel Kayyali, David Knott, and Steve Van Kuiken, “The Big-Data Revolution in US Health Care: Accelerating Value and Innovation,” McKinsey & Company, http://www.mckinsey.com/industries/healthcare-systems-and-services/our-insights/the-big-data-revolution-in-us-health-care
6. Mona Lebied, “9 Examples of Big Data Analytics in Healthcare that can Save People,” Business Intelligence, May 24th 2017, http://www.datapine.com/blog/big-data-examples-in-healthcare/
7. Jamie Condliffe, “China Turns Big Data into Big Brother,” MIT Technology Review, November 29, 2016, https://www.technologyreview.com/s/602987/china-turns-big-data-into-big-brother/
8. “China invents the digital totalitarian state: The worrying implications of its social-credit project,” The Economist, December 17, 2016, https://www.economist.com/news/briefing/21711902-worrying-implications-its-social-credit-project-china-invents-digital-totalitarian
Superstar Technologies for Superstar Companies
TECHNOLOGY LAUNCHING PADS – AN INTRODUCTION by Carol L. Fatuzzo and Ennio Fatuzzo
Yes, our book “Creating New Superstars” is a guide for achieving extreme business growth. And yes, it addresses topics such as business creativity and brilliant leadership. But it focuses on explosively developing new technologies and their power. Why? A superstar company’s exponential growth requires exponential change in the technology on which the business is based. And today, for the first time in history, the explosion of advances in Microelectronics, the Internet, and Biogenetics offers this possibility. We call these technologies “Launching Pads.”
These three technology Launching Pads, alone and/or in combination, are changing our world and creating new high growth business opportunities at unprecedented rates. Thus, in our fast-paced, technology-rich environment, it is impossible to ignore these technology-based forces that are shaping the future for business and for humanity.
MICROELECTRONICS
Consider Microelectronics. Is Microelectronics-based technology (integrated circuits) THE basic Launching Pad? It has given birth to or at least enabled our other two Launching Pads, so at the very least it is a basic building block for them. Think about it. The Internet wouldn’t have the enormous impact on the world it does today without the rapid increases in speed and data handling enabled by advances in Microelectronics. And Biogenetics would not be making its radical breakthroughs without the advanced computers and digital equipment based on Microelectronics that it uses as tools.
However you look at it, Microelectronics has created an ongoing revolution. It is pervasive and changing the world as we know it due to rapid advances in the technology. But the advances in Microelectronics are not just rapid. They are being made at exponentially increasing rates, as the doubling of microprocessor capabilities roughly every two years for the past several decades shows. The rapidly shrinking size of circuit elements and the increasing number of these tiny devices that fit on smaller and smaller chips have resulted in dramatic increases in computer processing speeds, data storage capacities, and more—much more. These radical improvements in digital electronics have irreversibly changed nearly every segment of the global economy.
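For a sense of what “doubling roughly every two years” compounds to, here is a back-of-the-envelope calculation. The starting count and years are illustrative round numbers rather than a claim about any particular chip.

```python
# A quick check of the "doubling roughly every two years" arithmetic.
# Starting count and years are illustrative round numbers only.
start_year, end_year = 1971, 2018
transistors = 2300                      # order of magnitude of an early microprocessor
doubling_period_years = 2.0

growth_factor = 2 ** ((end_year - start_year) / doubling_period_years)
print(f"{end_year - start_year} years of doubling every {doubling_period_years:g} years")
print(f"growth factor: about {growth_factor:,.0f}x")
print(f"projected transistor count: about {transistors * growth_factor:,.0f}")
```

A few thousand transistors compounded this way lands in the tens of billions, which is roughly the scale of today's largest chips and shows why this growth qualifies as exponential rather than merely fast.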
However, keep in mind that nothing is forever. If we had to make a prediction, we would pick the younger and more embryonic Biogenetics Launching Pad to be the successor to Microelectronics in the not-too-distant future.
BIOGENETICS
What is Biogenetics? Today, the process commonly used in Medical Biotechnology is genetic engineering. Genetic engineering refers to scientific procedures that allow the direct manipulation of genetic material to alter the hereditary traits of a cell, organism, or population. Hence the name Biogenetics. Today’s blockbuster drugs and the superstar businesses that have commercialized them are the basis of this newest technology Launching Pad.
But this is just the beginning. Breakthrough advances in Biogenetics are being made at an ever-increasing pace. Four of these emerging and rapidly developing areas—cloning of genes and organisms, stem cell research, and the genome editing technologies of TALENs and CRISPR—appear to us to be the most promising. Clearly, although still in its infancy, Biogenetics is a technologically rich area with perhaps an even greater potential than Microelectronics.
NANOMEDICINE
But what about new Launching Pads? As an example, we turn to a rapidly advancing area of science and engineering—the field of Nanotechnology. Nanotechnology is a very broad area of research involving dimensions less than 100 nanometers, but much of its promise is in the future. However, we believe that Nanomedicine, the rapidly advancing Life Science-based segment of Nanotechnology, has the potential to be a new Launching Pad on its own in the near term.
Simply stated, Nanomedicine is the application of Nanotechnology to medicine. It involves the monitoring, repair, construction and control of human biological systems at the molecular level, using engineered nanodevices and nanostructures. It ranges from the medical applications of nanomaterials to nanoscale biosensors and even to possible applications of programmable nanomachines and nanorobots—devices that would allow medical doctors to execute procedures in the human body at the cellular and molecular levels.
Nanobiosensors for measuring glucose, heart rate, blood pressure, etc. Injectable, wireless nanobots that carry out medical tasks, gather diagnostics and even deliver drugs into the bloodstream. Self-assembled, DNA based nanodevices for molecular scale diagnostics and smart drug delivery. Quantum wires for real-time sensing of biomarker proteins for cancer. Nanorobots for repairing damaged tissue, unblocking arteries, and replacing damaged organs. And the list goes on and on. Nanomedicine technology possibilities are endless and world-changing.
NEW SCIENCE FOR NEW LAUNCHING PADS
And what about technology Launching Pads in the longer term? There are many possibilities, including current areas of research such as Complexity Science, Subatomic Particles, the Makeup of the Universe, and the Search for Life beyond Earth. However, the path from a scientific discovery to a Launching Pad is long. So, the only certainty is that there will be new technology Launching Pads, and they will change the world.
A CAUTION
We are rapidly approaching a technology treasure room with many doors. Beyond each door, there is the potential for both great good and great harm. Only two things are certain. Which doors we open and when will determine the future of humankind and life as we know it. And once a door is opened, it can never be closed. Everything will change forever.
Should we open these doors? Will we? The answer to the last question is simple: Yes, because humans always have and always will.
For more detailed and easy-to-understand information on the technologies highlighted above and their impacts, see “Creating New Superstars” by Carol L. Fatuzzo and Ennio Fatuzzo, available from amazon: http://amzn.to/2hAn6dy
Big Data: An Exploding Agent of Change
by Carol L. Fatuzzo and Ennio Fatuzzo
Today, thanks to the internet, many kinds of data (Variety) are being sent, received, and accumulated at unprecedented rates (Velocity) and in unprecedented quantities (Volume). So, how can we manage this rapidly increasing amount of data and benefit from it? How can we discover hidden patterns and reveal unknown correlations?
Storing such massive quantities of data is only the beginning. To be useful, the data must also be accessible and analyzable rapidly and reliably. Following is an excerpt from our latest book, “Creating New Superstars,1” which addresses this opportunity.
BIG DATA AND BUSINESS ANALYTICS
“Business analytics refers to “the extensive use of data, statistical and quantitative analysis, explanatory and predictive (computer) models, and fact-based management to drive decisions and actions.”2 The rapid development and adoption of advanced business analytics technologies is already altering the business landscape.
Big data refers to data sets too large for traditional data processing. These data sets have the potential for “huge new benefits—but also heartaches.”3 The explosive emergence and availability of such huge, fast-changing, unstructured data from various old and new sources, mostly external to a business, and attempts to analyze them, have created the “age of information” ― an age where knowledge is power. But in many companies these unwieldy data sets have also created an “analysis bottleneck” that limits their usefulness.
But now it is possible to combine big data with advanced business analytics. Unparalleled and real-time access to vast quantities of data and the ability to rapidly analyze them in meaningful ways are already realities. Business management is being challenged with the rapidly growing technical capability of harnessing the vast potential that is hidden in multiple sources of massive data/information.
Today many companies already are analyzing big data to achieve significant competitive advantages―to improve products and services, cut costs, attract repeat customers, and more. An IBM Global Business Services Executive Report documents several big successes: “Companies like McLeod Russel India Limited completely eliminated systems downtime in the tea trade through more accurate tracking of the harvest, production and marketing of up to 100 million kilos of tea each year. Premier Healthcare Alliance used enhanced data sharing and analytics to improve patient outcomes while reducing spending by $2.85 billion. And Santam improved the customer experience by implementing predictive analytics to reduce fraud.”4
Still embryonic, though, are advanced analytical methodologies that can be applied to big data to build useful models for predicting and optimizing future outcomes. Such tools would enable leaders to make better decisions and make them faster and with lower risk; and might even help scientists make fundamental discoveries. This is the promise of the emerging field of data science, the marriage between big data and advanced analytics, the former providing the information, the latter supplying the tools that can be applied to that information to develop insight and guide action.5 However, there is one giant caution for business leaders. Big data and analytics, no matter how sophisticated and expertly used, will not replace or necessarily even predict disruptive innovations. Analyzing the past and extrapolating to the future is not likely to accurately predict a future shaped by unparalleled disruptive and exponential change.”
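Stepping outside the excerpt for a moment: the “predictive models” it refers to can be as simple as the toy classifier below, which is fit on made-up historical records and then used to score a new case. The features, data, and numbers are entirely synthetic; this is only a sketch of the idea, not anything from the book or the IBM report.

```python
# A minimal, self-contained sketch of the "predictive model" idea: fit a
# simple classifier on historical records, then score new cases to guide a
# decision. Features, data, and thresholds are synthetic and illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000

# Hypothetical customer features: past purchases and days since last visit.
past_purchases = rng.poisson(3, n)
days_since_visit = rng.exponential(30, n)
# Synthetic ground truth: frequent, recent customers are more likely to return.
p_return = 1 / (1 + np.exp(-(0.6 * past_purchases - 0.05 * days_since_visit - 0.5)))
returned = rng.random(n) < p_return

X = np.column_stack([past_purchases, days_since_visit])
X_train, X_test, y_train, y_test = train_test_split(X, returned, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", round(model.score(X_test, y_test), 2))

# Score a new customer and act on the prediction (e.g., send a retention offer).
new_customer = np.array([[1, 75.0]])   # 1 past purchase, 75 days since last visit
print("probability of returning:", round(model.predict_proba(new_customer)[0, 1], 2))
```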
A CAUTION AND A CONCERN
There is no question that the future benefits arising from the combination of big data and advanced analytics will be immense, but not everything is positive. For example, even with advances in analytics technology, including artificial intelligence, keep in mind the caution expressed above:
“Analyzing the past and extrapolating to the future is not likely to accurately predict a future shaped by unparalleled disruptive and exponential change.”
And for those who worry that big data collection may infringe on their privacy: Yes, large companies and organizations already have access to a lot of personal data and are using it. On the positive side, this is already leading to things such as improvements in healthcare outcomes and understanding new market trends for better business management.
But it is an entirely different situation when governments enter the arena. A number of articles have been written about “big data” in the hands of government evolving into “big brother.” A recent article in the Economist entitled “China invents the digital totalitarian state: The worrying implications of its social-credit project6” illustrates a concerning example.
The article describes a data collection and analysis project being run by the Chinese communist party to develop what they call a “social-credit system.” To summarize, using “big data” technologies, the project’s objective is to develop a system to collect and categorize as “good” or “bad” all available information for each individual citizen. Ultimately, rewards for good behavior (e.g., prizes, better housing) and punishments for bad behavior (e.g., denial of permissions to travel or access to loans and services) would be handed out – all this aimed at improving the allegiance of citizens to the State.
Will China be successful? How far will other governments go towards using big data to become “big brother” watching over each citizen? Certainly valid concerns. However, keep in mind that every breakthrough new technology has the potential for both good and bad. It all depends on the intentions of those who develop and apply the technology.
1. Ennio Fatuzzo and Carol L. Fatuzzo, Creating New Superstars: A Guide to Businesses that Soar above the Sea of Normality (USA: September 2016)
2. Thomas H. Davenport and Jeanne G. Harris, Competing on Analytics: The New Science of Winning (Boston: Harvard Business School Press, 2007), 7.
3. “Data, data everywhere,” The Economist, Feb 25th 2010, http://www.economist.com/node/15557443.
4. Michael Schroeck, Rebecca Shockley, Dr. Janet Smart, Professor Dolores Romero-Morales, and Professor Peter Tufano, “Analytics: The real-world use of big data,” IBM Global Business Services Executive Report, IBM Institute for Business Value (2012), accessed June 27, 2016, http://www-935.ibm.com/services/us/gbs/thoughtleadership/ibv-big-data-at-work.html.
5. Foster Provost and Tom Fawcett, “Data Science and its Relationship to Big Data and Data-Driven Decision Making,” Big Data, 1, no. 1 (March 2013), 51-59, http://online.liebertpub.com/doi/pdfplus/10.1089/big.2013.1508.
6. “China invents the digital totalitarian state: The worrying implications of its social-credit project,” The Economist, Dec 17, 2016, http://www.economist.com/news/briefing/21711902-worrying-implications-its-social-credit-project-china-invents-digital-totalitarian
Artificial Intelligence: A Door to the Future (And the Future is Now)
by Ennio Fatuzzo and Carol L. Fatuzzo
“2016: The year artificial intelligence exploded.” This is the title of a recent article in the SD Times that begins like this:
“Artificial intelligence isn’t a new concept. It is something that companies and businesses have been trying to implement (and something that society has feared) for decades. However, with all the recent advancements to democratize artificial intelligence and use it for good, almost every company started to turn to this technology and technique in 2016.”1
The article goes on to give examples of recent developments by Facebook, Microsoft, Google, and IBM. These are interesting, but only touch the surface of this rapidly developing technology area.
A December article in the New York Times Magazine2 does more to open one’s eyes to the progress and competition in this explosive area of technology, which has developed mostly “under the radar.” The title of the Times’ article is informative: “The Great A.I. Awakening: How Google used artificial intelligence to transform Google Translate, one of its more popular services — and how machine learning is poised to reinvent computing itself.” The article tells the story of how Google formed a new department (Google Brain) to focus on artificial neural networks and how that led to the radical transformation of their machine translation platform.
However, the Times article does more than focus on Google’s advances. It considers some of the broader issues associated with A.I. In the author’s own words:
“Google’s decision to reorganize itself around A.I. was the first major manifestation of what has become an industry wide machine-learning delirium. Over the past four years, six companies in particular — Google, Facebook, Apple, Amazon, Microsoft and the Chinese firm Baidu — have touched off an arms race for A.I. talent, particularly within universities… What is at stake is not just one more piecemeal innovation but control over what very well could represent an entirely new computational platform: pervasive, ambient artificial intelligence.”
Pervasive, ambient artificial intelligence—the author’s words. But is that in the future, or is it now? Virtual assistants are everywhere. And Google Brain is only one example of the race to develop more and more products that parallel human intelligence, not only in memorizing data but also in following instructions. Many companies are now working on machines that can teach themselves how to reach pre-determined goals. Apple, Facebook, Amazon, Microsoft, and the Chinese company Baidu are all developing such products. A subsidiary of Samsung, a South Korean company, announced machine-enhanced detection of breast cancer. And the list goes on.
But now consider a broader perspective. In the end, will all this effort on A.I. result in good for humanity or something else? For example, as we state in our recent book “Creating New Superstars”:
“We haven’t even mentioned numerous other technology advances with the potential for both great good and great harm such as robots with advanced artificial intelligence capable of learning and redesigning themselves and potentially acting independently from the humans that are supposed to be controlling them.”3
Consider what we said: robots with artificial intelligence, thinking and acting independently of the humans who control them. Such robots could theoretically be capable of redesigning themselves, or of designing and building computers or other robots better than themselves. And maybe these “super machines” could even rebel against the humans that originally created them.
Intelligent robots and other advanced artificial intelligence applications and devices may seem like science fiction, but new capabilities in this disruptive technology area are arising at an ever-increasing pace. This door to our future is rapidly opening, but what waits on the other side? Is it good or bad for humanity? According to experts such as Stephen Hawking and Bill Gates, artificial intelligence poses significant threats.4
But all major scientific and technology advances offer possibilities of great good or evil for humanity. Today:
“We have reached a room with many doors. Behind each door, there is a different future for us and our world. Should we open these doors? Do we want to? Will we? The answer is simple: Yes, because humans always have and always will.”5
And, as we open each new door, it is up to us to follow the new paths carefully and wisely.
References
1. Christina Cardoza, “2016: The year artificial intelligence exploded,” SD Times, December 26, 2016, http://sdtimes.com/2016-year-artificial-intelligence-exploded/
2. Gideon Lewis-Kraus, “The Great A.I. Awakening: How Google used artificial intelligence to transform Google Translate, one of its more popular services — and how machine learning is poised to reinvent computing itself,” The New York Times Magazine, December 14, 2016, http://www.nytimes.com/2016/12/14/magazine/the-great-ai-awakening.html?smprod=nytcore-ipad&smid=nytcore-ipad-share
3. Ennio Fatuzzo and Carol L. Fatuzzo, Creating New Superstars: Businesses that Soar above the Sea of Normality (USA: September 2016), p. 261. Available from amazon: http://amzn.to/2hAn6dy
4. James Barrat, “Why Stephen Hawking and Bill Gates are Terrified of Artificial Intelligence,” The Huffington Post, April 9, 2015, http://www.huffingtonpost.com/james-barrat/hawking-gates-artificial-intelligence_b_7008706.html; Eric Mack, “Bill Gates Says You Should Worry About Artificial Intelligence,” Forbes online, January 28, 2015, http://www.forbes.com/sites/ericmack/2015/01/28/bill-gates-also-worries-artificial-intelligence-is-a-threat/#684260ef3d10
5. Ennio Fatuzzo and Carol L. Fatuzzo, Creating New Superstars: Businesses that Soar above the Sea of Normality (USA: September 2016), p. 259. (Available from amazon: http://amzn.to/2hAn6dy)