Video & Multimedia
In reply to the discussion: About Mind Reading interview with Michio Kaku on Fox News

aint_no_life_nowhere:
This is science, not metaphysics. The goal is to develop a human brain-computer interface whereby brain signals can be interpreted and understood by a computer. Some of the research is being carried out at the University of California, Irvine. The approach, called synthetic telepathy, has already produced quite a bit of published research.
http://cnslab.ss.uci.edu/muri/research.html
http://www.nbcnews.com/id/27162401/ns/technology_and_science-science/t/army-developing-synthetic-telepathy/#.UwvCDoV1JeE
This is already working in the lab. Subjects hooked up to an electroencephalograph (EEG) can communicate individual letters to a computer through thought alone. That research was published back in 1988 in the leading journal Electroencephalography and Clinical Neurophysiology (an Elsevier Science publication).
http://drfarwell.com/pdf/Farwell-Donchin-1988-Talking-Off-the-Top-of-Your-Head-BCI-brain-computer-interface.pdf
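The idea behind the Farwell-Donchin "speller" is simple: letters sit in a 6x6 grid, rows and columns flash one at a time, and the flash containing the letter the subject is attending to evokes a larger P300 brain response. Averaging the responses across many repetitions and picking the strongest row and column recovers the letter. Here is a minimal sketch of that selection logic, with simulated "P300 scores" standing in for real EEG signal processing (the grid layout, signal strength, and noise level are all illustrative assumptions, not values from the paper):

```python
import numpy as np

# Illustrative 6x6 letter grid, as in the classic P300 speller paradigm
GRID = np.array([list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
                 list("STUVWX"), list("YZ1234"), list("567890")])

def simulate_flash_response(is_target, rng):
    # One scalar "P300 score" per flash: the attended row/column carries
    # a signal (here, an arbitrary 2.0), every flash carries noise
    return (2.0 if is_target else 0.0) + rng.normal(0.0, 1.0)

def decode_letter(target_row, target_col, n_repetitions=30, seed=0):
    rng = np.random.default_rng(seed)
    row_scores = np.zeros(6)
    col_scores = np.zeros(6)
    # Each row and each column flashes once per repetition
    for _ in range(n_repetitions):
        for r in range(6):
            row_scores[r] += simulate_flash_response(r == target_row, rng)
        for c in range(6):
            col_scores[c] += simulate_flash_response(c == target_col, rng)
    # Summing across repetitions lets the P300 stand out from the noise;
    # the strongest row and column intersect at the attended letter
    return GRID[row_scores.argmax(), col_scores.argmax()]

print(decode_letter(1, 2))  # prints "I", the attended letter
```

The key point the sketch illustrates is why repetition matters: a single flash response is buried in noise, but the noise averages out across trials while the P300 does not, which is exactly what makes letter-by-letter communication slow but reliable.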
Since then there has been further research, including development of the BCI2000 system for people who are completely paralyzed and can communicate only by thought, along with more sophisticated software algorithms for reading their brain waves. The research is ongoing and not yet perfected. Eventually, brain waves routed through a computer interface will be able to move artificial arms, and perhaps one day operate devices at a distance with computer assistance, a kind of synthetic telekinesis. However, the technology is based on physical, biological, observable, and testable phenomena in the laboratory, not metaphysics or spirituality. Here are a few more recent publications:
Donchin, E., & Arbel, Y. (2009). P300 based brain computer interfaces: A progress report. Foundations of Augmented Cognition. Neuroergonomics and Operational Neuroscience. Lecture Notes in Computer Science, 724–731.
Farwell, L. A., & Donchin, E. (1988). Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials. Electroencephalography and Clinical Neurophysiology, 70(6), 510–523.
Guger, C., Daban, S., Sellers, E., Holzner, C., Krausz, G., Carabalona, R. ...Edlinger, G. (2009). How many people are able to control a P300-based brain-computer interface (BCI)? Neuroscience Letters, 462(1), 94–98.
Mak, J. N., Arbel, Y., Minett, J. W., McCane, L. M., Yuksel, B., Ryan, D. ...Erdogmus, D. (2011). Optimizing the P300-based BCI: current status, limitations and future directions. Journal of Neural Engineering, 8(2), 1–7.
Sellers, E. W., Arbel, Y., & Donchin, E. (2012). P300 event-related potentials and related activity in the EEG. In Wolpaw, J. R. & Wolpaw, E. W. (Eds.), Brain-Computer Interfaces: Principles and Practice. New York, NY: Oxford University Press.
Sellers, E. W., Vaughan, T. M., & Wolpaw, J. R. (2010). A brain-computer interface for long-term independent home use. Amyotrophic Lateral Sclerosis, 11(5), 449–455.
And if you look at the video, Kaku is only referring to published, peer-reviewed research, such as his allusion to the 1988 paper I cited above, in which a person hooked up to an EEG machine concentrates on letters of the alphabet appearing on a computer screen and communicates that selection through brain waves alone. I think Kaku is on very solid scientific ground here, and you're being a little unfair to him.