Interestingly, a 2007 clip from NBC News has, for some reason, been making the rounds on the Internet once again as if it were being broadcast for the first time.
This clip and the accompanying commentary have served to galvanize those reading and listening to it into one of two camps: one that is subject to overhype, “clickbait” titles, and panic; and another that is so firmly entrenched in the American sheep class that, for its members, no bad news or report of malfeasance is real or anything other than a conspiracy theory.
This is because the NBC clip addresses an issue that would affect each and every individual in the country on a deeply personal level.
Entitled “The Year 2017,” the report predicts a near future in which every American will be micro-chipped by the year 2017:
Yet, while such a report may have caused consternation amongst the general public even as recently as 2007, much has changed in the last seven or so years.
The year 2015 brings a new kind of American, one that is largely unconcerned with anything other than the most basic needs of sustenance and entertainment.
While, in 2007, Americans who were willing to engage the ruling class on the idea of micro-chipping humans were still a distinct minority, the seven years since the report have seen a major increase in the number of individuals who are indeed willing to consider the possibility.
But, while those actually concerned about the idea of a micro-chipped population are often bombarded with misinformation, such as rumors that the AHCA contains provisions to forcibly implant microchips, the concerns over micro-chipping (mandatory and otherwise) are very real. Not only does the technology exist, it is currently being implemented. Indeed, the goal is to gradually expand the frequency and ubiquity of such technology until it becomes the norm and, eventually, a procedure enforced by law.
Biometrics Already Used
The video clip mentioned above focuses entirely upon the concept of biometric methods and micro-chipping being used for identification and financial transactions. As the report mentions, forms of biometric identification for payment and ID have been used for years, with a relatively warm reception amongst the general public, even though the majority have yet to join the bandwagon.
Regardless, such technology has expanded at a rapid pace over the last ten years, and especially over the last five. For instance, in June 2012, Homeland Security News Wire announced that researchers at the Biometric Technologies Laboratory at the University of Calgary had improved upon commercially available biometric identification technologies to the point of creating a form of artificial intelligence capable of making decisions about biometric information received from a variety of sources.
The new biometric security program works by simulating the “learning patterns and cognitive processes of the brain.” The system was developed by the research and application of “neural network-based models for information fusion.”
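The Calgary system’s neural-network fusion models are not detailed in the report, but the underlying idea, combining match scores from several biometric sources into one decision, can be illustrated with a simple weighted (non-neural) score-level fusion. All modality names, weights, and thresholds below are invented for illustration only.

```python
# Score-level fusion sketch: each biometric modality yields a match
# score in [0, 1]; a weighted combination drives the accept decision.
# This illustrates the general concept, not the Calgary system itself.

def fused_score(scores: dict, weights: dict) -> float:
    """Weighted average of per-modality match scores."""
    total_weight = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total_weight

# Hypothetical weights reflecting how much each modality is trusted.
WEIGHTS = {"fingerprint": 0.5, "vein": 0.3, "face": 0.2}

reading = {"fingerprint": 0.91, "vein": 0.85, "face": 0.40}
score = fused_score(reading, WEIGHTS)
print(round(score, 3), score >= 0.7)  # accept if the fused score clears 0.7
```

A weak reading in one modality (here, the face score) can be outvoted by strong readings in the others, which is the practical appeal of fusing sources rather than relying on any single scanner.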
In an article published in the Sunday Telegraph on December 4, 2011, Rosie Squires describes another biometric scheme and how many Australian employers are introducing a fingerprinting program in order to monitor employees and “save costs.”
The new fingerprint scanners will be taking the place of time clocks, trust, responsible hiring, and, apparently, competent supervisors. No longer will the employees of companies such as Qantas, Dan Murphy’s, Breville, and Unomedical be able to clock in and out of work in the traditional manner. In order to prevent employees from “arriving late or slacking off,” the workers will now be forced to render some of their most private information to their employer via the new scanners.
The new technology, PeopleKey, will be used not only to clock employees on their way in and out, but also to monitor their progress over the course of the workday, as well as other potential incidents of “slacking off” like using the bathroom or daring to engage a fellow employee in conversation.
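PeopleKey’s internals are proprietary, so the sketch below is purely hypothetical: it assumes the scanner emits a stable template per employee, hashes that template so no raw biometric data is retained, and appends timestamped punch events under the hash.

```python
import hashlib
from datetime import datetime

class BiometricTimeClock:
    """Illustrative fingerprint time clock (not PeopleKey's design).

    Templates are hashed with SHA-256 so the ledger never stores the
    raw biometric reading itself.
    """

    def __init__(self):
        self.events = {}  # template hash -> list of (timestamp, action)

    def _key(self, template: bytes) -> str:
        return hashlib.sha256(template).hexdigest()

    def punch(self, template: bytes, action: str, when: datetime = None) -> None:
        when = when or datetime.now()
        self.events.setdefault(self._key(template), []).append((when, action))

    def log_for(self, template: bytes):
        return self.events.get(self._key(template), [])

clock = BiometricTimeClock()
alice = b"simulated-fingerprint-template"  # stand-in for a scanner reading
clock.punch(alice, "in", datetime(2011, 12, 5, 8, 58))
clock.punch(alice, "out", datetime(2011, 12, 5, 17, 2))
print(len(clock.log_for(alice)))  # 2 punch events recorded
```

Note that even this privacy-conscious variant still builds exactly the kind of per-employee activity ledger the article describes.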
Vein scanners are another biometric technology being introduced. An article published on August 8, 2011 in Technology Review, entitled “Beyond Cell Phone Wallets, Biometrics Promise Truly Wallet-Free Future,” explains that major corporations are not even waiting for the “digital wallet” to catch on.
They are moving forward with a system that will allow individuals to swipe their palm, not their phone, in front of a digital recognition device in order to gain access to buildings, pay for merchandise, or otherwise identify themselves.
In addition, a new system known as PalmSecure requires no tangible hardware on the part of the user, so phones are not necessary. All it requires is that the user wave his or her hand in front of an electronic reader; the small device reads the unique pattern of veins by way of near-infrared light. New York University’s Langone Medical Center has already implemented the vein scanners in some of its medical facilities.
Health histories, insurance forms, and other documents are all handled electronically and at a much faster pace with the help of the new vein scanners.
As Jonathan Allen reports for Reuters:
The initial set-up for a new patient takes about a minute, the hospital said, while subsequent scans only take about a second.
“We can then just ask one question: ‘Has your insurance changed?’ Birnbaum [Bernard Birnbaum, the vice dean and chief of hospital operations for the center] said. If ‘no,’ you don’t have to fill out a single form.”
Although NYU’s program has received more attention than most of the other vein scanner stations, it is important to remember that it is not the only one of its kind. It is merely one of the first to be implemented at a hospital in the Northeast. Several other hospitals have already introduced the system and more will likely follow.
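The PalmSecure vendor does not publish its matching algorithm, but template-based biometric matching in general can be sketched simply. Assume, for illustration, that the scanner reduces the near-infrared vein image to a fixed-length bit string; a live reading is then accepted if it differs from the enrolled template by no more than a small tolerance, since two scans of the same palm never match exactly.

```python
# Toy template matcher: compare bit-string templates by Hamming
# distance. The encoding and tolerance are invented for illustration.

def hamming(a: str, b: str) -> int:
    """Number of differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

def matches(enrolled: str, live: str, tolerance: int = 8) -> bool:
    # Live scans never reproduce the template exactly, so a small
    # number of differing bits is tolerated.
    return hamming(enrolled, live) <= tolerance

enrolled = "1011001110001111" * 4   # 64-bit toy template
live_ok  = enrolled[:-2] + "00"     # nearly identical reading, 2 bits off
live_bad = "0100110001110000" * 4   # a different palm entirely

print(matches(enrolled, live_ok))   # True
print(matches(enrolled, live_bad))  # False
```

The tolerance is the security/convenience dial: too tight and legitimate users are rejected, too loose and different palms begin to collide.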
Palm scanners have also been introduced in school lunch lines, reportedly “to identify students and thereby reduce waste and the threat of impersonation.”
With the new scanners, students have their meals deducted from their accounts upon scanning their palms as they march single file through the lunch lines. Of course, this type of technology is not new to Pinellas County: students there have been scanning their fingers to gain access to their lunches for years.
Google, DNA screening companies, and other technology firms are not the only institutions attempting to gain access to DNA material. In fact, one of the most disturbing groups (outside of government itself) to have embarked on a mission to acquire DNA samples is the Grand Lodge of Freemasonry.
Known as MasoniChip, the program is openly administered by the Grand Lodge and is operated with the support of governments in both the United States and Canada. Indeed, MasoniChip has received so much support from the government sector that many have been duped into believing that it is merely a government program being supported by the Masons even though the reality is actually the opposite.
MasoniChip promoters set up fairs, advertise the program through local school districts and enter into partnerships with local law enforcement. In typical form, the mainstream media also promotes the program and the organization, which apparently has possession of its own police dog, Mason.
For those who may be in the dark as to what MasoniChip is, Amy MacPherson of the Huffington Post describes the program in this manner:
It begins on the surface as a child identification project, in case your loved ones are ever to be horrendously abducted. Parents are familiar with at-home kits to record their kids’ vital information, for protection against the greatest of all fears to be inflicted on a family.
Normally height, weight, hair and eye color are recorded, along with a set of fingerprints and hopefully a current photograph. It’s just the good folks at your local Masonic Lodge saw fit to take things further.
With advances in technology, they began to offer digital fingerprints, digital imaging, digital video, dental impressions and DNA mouth swabs. This data processing is managed by their proprietary software that’s designed to be compatible with local and national law enforcement.
This is, after all, a campaign created by police in the brotherhood, regardless of its private funding.
Yet, for all the conflation of the Masonic program with government involvement, the truth is that the program is entirely private – meaning it belongs to the Grand Lodge.
The Freemasons’ own website clearly states that this is the case by writing,
We the Freemasons are the sole “sponsor” of the Masonic Safety Identification initiatives as developed in our various Masonic Grand Lodge Jurisdictions.
As such we schedule the Events and coordinate the equipment, materials and volunteers necessary to conduct events.
All groups and individuals are welcome to work alongside, but they are not referred as sponsors but listed and involved as “supporters”, “supporting partners”, “corporate partners”, “in collaboration with”, or “in cooperation with.”
MasoniChip states that, in addition to recording the children’s data themselves, it will provide its own “health care professionals” to collect the DNA samples at whatever event the DNA gathering is scheduled to occur.
As MacPherson writes,
There is no way to guarantee what happens behind closed doors and although they claim to delete sensitive information (the Canadian website states “No information is ever stored by the MasoniChIP program”), any computer savvy person knows that clicking an “x” isn’t permanent unless you format the entire system.
Parents are asked to trust an intriguing, private fraternity; to ensure that quality standards are met and family privacy is legally respected without any kind of oversight.
Because Freemasons fund 100 per cent of the initiative, there is no opportunity to discuss issues regarding data ownership or how they feel about those technicalities in the privacy of their meetings.
With somber scrutiny and if further tragedy struck, authorities would match remains with parental samples for definitive confirmation.
It is the parents’ DNA that could aid in matching the unnamed, but only accredited laboratories are permitted to conduct the process. Whether a parent or child, collecting DNA cannot occur at an open park event, run by stranger volunteers and become admissible to the national database.
It is with great sadness for grieving families that we must note the Freemason project is not supported by government DNA databases.
Overall, MacPherson accurately concludes that,
“the most controversial component of the MasoniChip undertaking is not recognized for the purpose they advertise and state to parents.”
Of course, regardless of the stated reasons for acquiring the DNA samples, a massive DNA database is being created.
The MasoniChip program had registered over 1.5 million children by the end of 2012 and is apparently going to be extended to seniors and the disabled in the near future.
In the conclusion to her article, MacPherson asked the question,
“And why is the face of government through public schools or police through public events, being placed on an effort from private organizations to mislead parents?”
The answer, of course, is clear so long as one is not concerned with being labeled a conspiracy theorist. Yet the programs and technologies mentioned above are by no means the pinnacle of biometric applications that contain a heavy element of cashless-society promotion and data retention on the part of the security-state surveillance apparatus.
For instance, Nicholas West of Activist Post has extensively detailed biometrics-related technology as well as the creation and implementation of AI (Artificial Intelligence), surveillance technology, and even electronic-technological mind control technology developed by the corporate/government weapons industry and DARPA-style development industries.
Similarly, the question of the existence and implementation of microchips for tracking, tracing, identification, and other purposes is not merely a concept scheduled for deployment in the future; it is one that is already here. In fact, the practice of implanting microchips for both identification and tracking has been around for quite some time.
Earlier this year, an office building in Stockholm, Sweden began implanting its workers with microchips for identification purposes, allowing staff members to interact with smartphones, open doors, gain access to specific locations, and operate equipment.
In doing a report for BBC on the Epicenter building which houses offices for Google and Microsoft, Rory Cellan-Jones volunteered to be micro-chipped and had a chip embedded in his hand.
Cellan-Jones was not the only BBC reporter to have been implanted with a microchip. In 2004, another BBC reporter, Simon Morton, was embedded with a microchip while working on a report about a Barcelona nightclub that uses microchips for VIP admittance and accepting payment for drinks.
The same year, numerous mainstream media organizations reported that at least 160 Mexican government employees had been implanted with microchips for “security reasons,” with plans to implant even more in the coming months.
As Will Weissert of Associated Press reported,
Security has reached the subcutaneous level for Mexico’s attorney general and at least 160 people in his office — they have been implanted with microchips that get them access to secure areas of their headquarters.
It’s a pioneering application of a technology that is widely used in animals but not in humans.
Mexico’s top federal prosecutors and investigators began receiving chip implants in their arms in November in order to get access to restricted areas inside the attorney general’s headquarters, said Antonio Aceves, general director of Solusat, the company that distributes the microchips in Mexico.
Attorney General Rafael Macedo de la Concha and 160 of his employees were implanted at a cost to taxpayers of $150 for each rice grain-sized chip.
More are scheduled to get “tagged” in coming months, and key members of the Mexican military, the police and the office of President Vicente Fox might follow suit, Aceves said.
Fox’s office did not immediately return a call seeking comment.
A spokeswoman for Macedo de la Concha’s office said she could not comment on Aceves’ statements, citing security concerns. But Macedo himself mentioned the chip program to reporters Monday, saying he had received an implant in his arm.
He said the chips were required to enter a new federal anti-crime information center.
“It’s only for access, for security,” he said.
The chips also could provide more certainty about who accessed sensitive data at any given time. In the past, the biggest security problem for Mexican law enforcement has been corruption by officials themselves.
Weissert also reported that over 1,000 Mexicans had them implanted for medical reasons.
The CEO of Applied Digital Solutions, Scott Silverman, also revealed that ADS had sold around 7,000 chips and that a sizable number had been implanted in individuals for security or identification purposes.
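The implants described in these reports are passive RFID tags: a reader energizes the chip, reads out a unique identifier, and a back-end system decides what that identifier is allowed to do. The actual Solusat/VeriChip deployment details are not public, so the chip IDs, zone names, and lookup below are purely illustrative.

```python
# Minimal illustrative access-control back end for implanted RFID tags.
# The chip itself carries no permissions; it only supplies an ID that
# the door reader checks against an enrollment table like this one.

AUTHORIZED = {
    "chip-0001": {"lobby", "records-room"},
    "chip-0002": {"lobby"},
}

def may_enter(chip_id: str, zone: str) -> bool:
    """Grant access only if the scanned chip is enrolled for the zone."""
    return zone in AUTHORIZED.get(chip_id, set())

print(may_enter("chip-0001", "records-room"))  # True
print(may_enter("chip-0002", "records-room"))  # False
print(may_enter("chip-9999", "lobby"))         # False: unknown chip
```

Because every read can also be logged, the same table that opens doors doubles as a record of exactly who entered which area and when, which is the auditing use the AP report alludes to.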
In 2002, a family in Boca Raton, Florida was voluntarily implanted with microchips in order to provide a more rapid response to medical emergencies and because they were concerned about terrorism after 9/11.
As USA Today reported,
A Florida family on Friday became the first to be implanted with computer chips that researchers hope will someday become an easy way to provide emergency room staffers with patients’ medical information.
Jeff and Leslie Jacobs, along with their 14-year-old son, Derek, had the tiny chips implanted in their arms. Each chip is about the size of a grain of rice, and insertion takes about a minute under local anesthesia.
The chips, called the VeriChip, were designed by Palm Beach-based Applied Digital Solutions. They are similar to chips implanted in pets to identify them if they are lost.
The family wanted the implants in case of future medical emergencies.
In 2011, CityWatch, a Cincinnati, Ohio-based firm became the first US business to use microchips in its employees.
For years, many have mocked the idea of implantable microchips and cyborgs as both conspiracy theories and science fiction. Anyone who so much as mentioned these possibilities to their neighbor risked being labeled either as a religious fanatic or delusional and paranoid. However, as they have become more and more prevalent in everyday society, it has become increasingly difficult to ridicule these concepts.
For instance, with stories like the Singularity Hub article entitled, “Revolutionary New Brain Chip allows Monkeys to Grasp and Feel Objects Using Their Thoughts,” these emerging technological possibilities are almost impossible to ignore.
This article discusses how scientists have recently announced the creation of an implantable device that can be placed in the brain and which will allow for the control of computers by thought.
Dr. Miguel Nicolelis and company have already tested these devices in monkeys with stunningly accurate results. In addition to allowing the user to control the computer by thought, it also allows the user to feel the virtual object it is manipulating. This device is not the first of its kind.
For years, implants have allowed monkeys to control computer cursors and even robotic arms in laboratory settings.
In a 2011 experiment, two macaque monkeys were trained to control a virtual arm represented on the computer screen and use the arm to “grasp” virtual objects. The difference between this latest experiment and those that have preceded it, however, is that these monkeys were able to actually feel the objects they were grasping.
In a testament to just how fast the coming cyberization of mankind has progressed, a new report published by the Daily Mail entitled, “Hitler would have loved The Singularity – Mind-blowing benefits of merging human brains and computers,” reaffirms most of what I have been writing about for some time. Namely, that the merging of man and machine is much closer than the average person is willing to believe.
In the news report, Ian Morris, Professor of Classics and History at Stanford University and author of Why The West Rules – For Now, briefly overviews years of mainstream history involving the development and implementation of Singularity-related technologies.
Before going much further, however, it is important for the reader to understand just what is meant when the term “Singularity” is used.
Defined by TIME, “Singularity” is,
“The moment when technological change becomes so rapid and profound, it represents a rupture in the fabric of human history.”
Simply put, the Singularity is the moment when man and machine merge to create a new type of human: a singular entity that possesses properties of both machines and humans. If the concept of the Singularity is new to you, I suggest reading “The Singularity Movement, Immortality, and Removing the Ghost in the Machine.”
In this article, I discuss the premise behind the movement, and some of the implications it holds for basic human freedom, dignity, and even our own existence.
Unfortunately, the Singularity is not a fringe movement, as some might believe at first; it has a great number of followers, many of whom are in powerful positions. For instance, Singularity University is a three-year-old institution that offers interdisciplinary courses for both executives and graduate students.
It is hosted by NASA, a notorious front for secretive projects conducted by the government and the military-industrial complex. Not only that, but Google, which is yet another corporate front for intelligence agencies, was a founding sponsor of the University as well. It is this context in which Ian Morris writes his own article about the coming merger of human brains and computers.
Morris prefaces his commentary on Singularity by pointing out some mainstream (even if not well-known) facts regarding the development of technology that he, and many others who are informed on the subject, believes will allow for actually sending human thoughts over the Internet.
All of this, of course, will take place after human brains are chipped, or otherwise linked to computers.
Ten years ago, the US National Science Foundation predicted ‘network-enhanced telepathy’ – sending thoughts over the internet – would be practical by the 2020s.
And thanks to neuroscientists at the University of California, we seem to be on schedule.
Last September, they asked volunteers to watch Hollywood film trailers and then reconstructed the clips by scanning their subjects’ brain activity.
He continues by saying:
Last week, the scientists boldly went further still.
They charted the electrical activity in the brains of volunteers who were listening to human speech and then they fed the results into computers which translated the signals back into language.
The technique remains crude, and has so far made out only five distinct words, but humanity has crossed a threshold.
The threshold that Morris refers to is the moment when the merging of man and machine is announced to the general public, not necessarily the moment when it becomes possible.
Indeed, we know that any research or development announced to the general public is, in reality, much further behind the true capabilities of the technology. For instance, the ability to control brain function via computers or for brains to control computers by thought has been available for many years.
Only the crude forms of this technology have been introduced for mass consumption. Even so, the introduction came a great many years after the actual development.
Yet, after pointing out some of the positive aspects that this technology might present to humanity, such as providing speech to those impaired by neurodegenerative diseases, or movement to those suffering from paralysis, Morris points out some other rather disturbing directions this rapidly developing technology might take.
Disturbing, that is, if one is not part of the Singularity cult. Nevertheless, Morris moves through some innocuous and unquestionably beneficial developments such as eyeglasses and ear trumpets, which show the lengths to which technology has progressed and the relatively short time scale it has taken to do so.
These devices have either become a normal part of life, or have given way to other more advanced technologies. These more advanced devices such as hearing aids, dialysis machines, and pacemakers have all become normal and accepted machine additions as well.
However, as Morris writes:
By the second decade of the 21st Century, we have become used to organs grown in laboratories, genetic surgery and designer babies.
In 2002, medical researchers used enzymes and DNA to build the first molecular computers, and in 2004 improved versions were being injected into people’s veins to fight cancer
By 2020 we may be able to put even cleverer nano-computers into our brains to speed up synaptic links, give ourselves perfect memory and perhaps cure dementia.
If nano-computers implanted in our brains would indeed enhance these functions of the human brain, thus making possible further technological and biotechnological advancements, then it is realistic to believe (as many in the Singularity movement do) that the human being as we know it will cease to exist.
The old man will be replaced by the new. That which was made imperfect would be made perfect.
This is exactly the future which Singularity promoters like Juan Enriquez have been foreseeing. Enriquez’s long resume affirms that those in prominent positions hold fast to what is essentially a modern version of eugenics based on more than mere ethnicity.
Enriquez himself states that humanity, by virtue of Singularity, will develop into an entirely different species.
The new human species is one that begins to engineer the evolution of viruses, plants, animals, and itself.
As we do that, Darwin’s rules get bent, and sometimes even broken. By taking direct and deliberate control over our evolution, we are living in a world where we are modifying stuff according to our desires…
Eventually, we get to the point where evolution is guided by what we’re engineering. That’s a big deal. Today’s plastic surgery is going to seem tame compared to what’s coming.
Enriquez also admits that, as a result of this emerging technology, a “new ethics” must be developed to go along with the opportunities for eugenics that now present themselves.
The issue of [genetic variation] is a really uncomfortable question, one that for good reason, we have been avoiding since the 1930s and ’40s.
A lot of the research behind the eugenics movement came out of elite universities in the U.S. It was disastrously misapplied. But you do have to ask, if there are fundamental differences in species like dogs and horses and birds, is it true that there are no significant differences in humans?
We are going to have an answer to that question very quickly.
If we do, we need to think through an ethical, moral framework to think about questions that go way beyond science.
However, the open promoters of Singularity such as Juan Enriquez and Ray Kurzweil are not the root of the movement.
As Morris points out, projects related to merging the human brain with the computer have been funded mostly by DARPA (the Defense Advanced Research Projects Agency). After all, it was DARPA that produced the Internet (called ARPANET in the 1970s), and it was DARPA’s Brain Interface Project that was the first foray into molecular computing. As I mentioned earlier, however, one should be aware that even those projects that have been announced and revealed to the general public are actually far behind the true timeline of development.
DARPA’s research and discoveries are years or decades ahead of anything they introduce, even retroactively, to the scientific community at large, much less the general public.
This is why programs such as Silent Talk are exploring mind-reading technology by reading the electrical signals inside the brains of soldiers and then broadcasting them for two-way communication between soldiers over the Internet.
As Morris writes,
“With these implants, entire armies will be able to talk without radios. Orders will leap instantly into soldiers’ heads and commanders’ wishes will become the wishes of their men.”
Add this to the fact that “mind reading” technology is already being rolled out in Western airports, and one can easily see an agenda at work.
A very crude version of the neuron-scanning technology discussed by Morris, these “Emotion Detectors” use video cameras and facial cues, as well as thermal imaging technology, to detect emotions that are unacceptable to “authorities.”
However, the technology Morris writes about is much more advanced than emotion scanners. Even the definition of “mind reading” in terms of the new interface programs tends to be more dynamic. Consider how Morris describes Ray Kurzweil’s prediction of where mind reading programs will go in the future.
Since the Sixties, computer chips have been doubling their speed and halving their cost every 18 months or so.
If the trend continues, the inventor and predictor Ray Kurzweil has pointed out that by 2029 we will have computers powerful enough to run programs reproducing the 10,000 trillion electrical signals that flash around your skull every second.
They will also have enough memory to store the ten trillion recollections that make you who you are. And they will also be powerful enough to scan, neuron by neuron, every contour and wrinkle of your brain.
What this means is that if the trends of the past 50 years continue, in 17 years’ time we will be able to upload an electronic replica of your mind on to a machine
There will be two of you – one a flesh-and-blood animal, the other inside a computer’s circuits.
And if the trends hold fast beyond that, Kurzweil adds, by 2045 we will have a computer that is powerful enough to host every one of the eight billion minds on earth.
Carbon and silicon-based intelligence will merge to form a single global consciousness.
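Kurzweil’s timeline, as Morris relays it, rests on simple compound doubling: if chip performance doubles every 18 months, capacity grows by a factor of 2^(years / 1.5). The few lines below check only the arithmetic of that extrapolation, not the claim itself; the 18-month period and the 17- and 33-year horizons are taken from the passage above.

```python
# Compound-doubling arithmetic behind the Kurzweil projection quoted
# above: performance doubles every 18 months (1.5 years).

def growth_factor(years: float, doubling_period_years: float = 1.5) -> float:
    """Multiplicative capacity growth after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

# 17 years out (the 2029 milestone in the quote): about 2,580x.
print(round(growth_factor(17)))

# 33 years out (the 2045 "global consciousness" date): 2**22 = 4,194,304x.
print(f"{growth_factor(33):.2e}")
```

The arithmetic is unremarkable; the contested part of the prediction is whether the 18-month doubling trend holds for decades, which the formula simply assumes.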
The world being described here is not much different from the one presented in movies like The Matrix or Ghost in the Shell: a world where humans have been physically altered in order to be linked with the Internet.
In both movies, there is a version of the “single global consciousness” where cyberized humans are fully merged into the virtual world. Yet, although such technology has been portrayed as science fiction for years, the fact is that the Singularity is now a very real possibility.
As US Col. Thomas Adams stated, technology,
“is rapidly taking us to a place where we may not want to go, but probably are unable to avoid.”
He should know – Western militaries have been preparing for the Singularity for some time.
In this context, where war becomes literally ingrained, the dystopic vision of dark science fiction becomes promoted as a real-world solution. The concept and actual application of the control of the human brain by outside forces via a “brain chip” is itself nothing new, even in popular discourse.
For instance, in 1965, it was reported by the New York Times that Dr. Jose Delgado was able to control the brain and body of a bull via the use of a “brain chip.”
In the article entitled “Matador with A Radio Stops Wired Bull – Modified Behavior in Animals the Subject of Brain Study,” John A. Osmundsen reported,
Afternoon sunlight poured over the high wooden barriers into the ring, as the brave bull bore down on the unarmed “matador” – a scientist who had never before faced a fighting bull. But the charging animal’s horns never reached the man behind the red cape.
Moments before that could happen, Dr Jose Delgado, the scientist, pressed a button on a small radio transmitter in his hand and the bull braked to a halt. Then he pressed another button on the transmitter, and the bull obediently turned to the right and trotted away.
The bull was obeying commands in his brain that were being called forth by electrical stimulation – by the radio signals – of certain regions in which the fine wires had been painlessly implanted the day before.
The experiment, conducted last year in Cordova, Spain, by Dr Delgado of Yale University’s School of Medicine, was probably the most spectacular demonstration ever performed of the deliberate modification of animal behavior through external control of the brain. He has been working in this field for more than 15 years.
Techniques that he and other scientists have recently developed have been refined to the point where, he believes,
“a turning point has been reached in the study of the mind.”
“I do believe,” he said in a recent lecture, “that an understanding of the biological bases of social and antisocial behavior and of mental activities, which for the first time in history can now be explored in a conscious brain, may be of decisive importance in the search for intelligent solutions to some of our present anxieties, frustrations and conflicts.”
Dr Delgado’s contention that brain research has reached a stage of refinement where it can contribute to the solution of some ... problems is based, he said, on many of his own experiments.
These have shown, he explained, that,
“functions traditionally related to the psyche, such as friendliness, pleasure or verbal expression, can be induced, modified and inhibited by direct electrical stimulation of the brain.”
For example, he has been able to “play” monkeys and cats like little electronic toys that yawn, hide, fight, play, mate and go to sleep on command.
With such techniques, Dr Delgado has shown:
Monkeys will learn to press a button that sends a stimulus to the brain of an enraged member of the colony and calms it down, indicating that animals can be taught to control each other’s behavior.
A monkey, stimulated to extremely aggressive behavior will make “intelligent” attacks only on competitive members of the colony, sparing other, friendlier, ones.
Monkeys and cats can be triggered into sequential behavior in which one might open its mouth, turn around, walk to a corner, climb a wall, jump down and return to “start,” repeating those movements in the same order every time they are stimulated but will modify the pattern if other animals get in the way or if they are threatened.
The latter two experiments show that electrical brain stimulation does not simply evoke automatic responses but reactions that become integrated into the social behavior according to the individuals own personality or temperament, Dr Delgado said.
Public Relations, Propaganda, and Coercion
Without a doubt, many of the stories mentioned above were reported by mainstream media outlets not to alert citizens to a coming violation of their privacy or personal/civil liberties, nor even to begin a debate about the merits and drawbacks of implanting microchips in humans, but as predictive programming to condition the average person to accept such implantation.
For years, secretive documents had discussed the idea of possibly micro-chipping the human population. Over recent years, however, mainstream reports from major news organizations have consistently “revealed” isolated incidents of individuals volunteering for chipping, or of workers in other countries being coerced into it by their employers.
Along with these stories – presented solely for the purposes of conditioning the public to accept the idea and practice of chipping in “some” instances and thus all “other” instances as well – came the stories predicting that by year “*insert year here*” all Americans would be fully micro-chipped.
In addition to predictive programming, these articles and predictions are designed to present the reader with a sense of inevitability that the culture of chipped humans will indeed exist in the near future. The Dateline episode mentioned at the beginning of this article is a case in point, predicting with an air of relative certainty that all Americans will be micro-chipped by 2017.
In his book The Age of Spiritual Machines, transhumanism proponent Ray Kurzweil predicted that, by 2019, computers would be implanted in the brains of humans allowing them to access the Internet with their minds, creating a “human underclass” of those individuals who refuse to become part of the Singularity.
The UK Ministry of Defence’s Development, Concepts and Doctrine Centre wrote a 90-page document which, according to the Guardian, predicted that,
“By 2035, an implantable ‘information chip’ could be wired directly to the brain.
A growing pervasiveness of information communications technology will enable states, terrorists or criminals, to mobilize ‘flashmobs’, challenging security forces to match this potential agility coupled with an ability to concentrate forces quickly in a small area.”
Unfortunately, the method of predictive programming and the sense of inevitability (a technique of MindWar) is exceedingly effective.
Indeed, while the vast majority of Americans are entertaining themselves into a prison cell and have no concern for anything other than their most basic needs and entertainment/pleasure avenues, others will jump at the chance to be at the head of the trendy line.
Among those who find this trend concerning and are aware of the agenda, most are members of the alternative media community who have fallen into complacency or into the unfortunate habit of doing nothing more than panicking, hiding, and yapping about the problem as opposed to organizing a counterattack.
Others still are convinced by their religion that a micro-chipped and controlled population is “God’s Will” and thus, inevitable and unchangeable.
The first step in introducing the idea of chipping humans was the move to microchip pets to prevent their becoming lost. That idea, so “successful” with animals, then turned towards humans – children, the mentally ill, the elderly – and ultimately to every other person regardless of health or mental faculties.
Clearly, Will Weissert correctly described the chip in his article by stating that,
“The chip originally was developed to track livestock and wildlife and to let pet owners identify runaway animals.”
By way of comparison, it is important to point out that the procedure of implanting pets with microchips was itself, at one point, considered taboo.
After a significant media campaign, however, bringing veterinarians, animal welfare advocates, concerned pet owners, and other “experts” on board, the tide of public opinion slowly began to shift. Now, one can scarcely find an animal that has not been chipped, particularly at any shelter or pet store.
This is much the same media campaign that has been implemented in the promotion of human chipping. Combined with fear, “snob appeal” is and will be used to promote the chipping of the human population.
The chip will first be used to allay fear of terrorism, death, or losing one’s family members and will then, as in the case of the Spanish nightclub, take on a more trendy and desirable presentation.
Cool people have chips. Rich people have chips. Only religious fundamentalists and squares refuse them.
Once again, convenience and social class will be used to drag Americans and, eventually, the rest of the world into an even greater, perhaps irrevocable, prison than that which they are already in.
Yet we can go even further back, to another media campaign, to see the precedent for eventual forced chipping: the debate over mandatory vaccination, in which individuals are forced to accept injections into their bodies to which they do not consent.
This very fact was admitted by the Air Force itself in a 1996 article entitled “Implanted Microscopic Chip”. In this article, the authors admit that the precedent for chip implantation is the process of vaccination and, specifically, the policy of mandatory vaccination.
The article reads,
Ethical and Public Relations Issues. Implanting “things” in people raises ethical and public relations issues.
While these concerns may be founded on today’s thinking, in 2025 they may not be as alarming. We already are evolving toward technology implanting.
For example, the military currently requires its members to receive mandatory injections of biological organisms (i.e., the flu shot). In the civilian world, people receive mechanical hearts and other organs. Society has come to accept most of these implants as a fact of life.
By 2025 it is possible medical technology will have nerve chips that allow amputees to control artificial limbs or eye chips that allow the blind to see. The civilian populace will likely accept implanted microscopic chips that allow military members to defend vital national interests.
Further, the US military will continue to be a volunteer force that will freely accept the chip because it is a tool to control technology and not a tool to control the human.
After looking at how many different applications and programs are currently being developed and introduced, with absolutely no prior demand from the consuming public, one thing is clear – there is a coordinated effort to implement these types of cashless, and now even person-less, transactions and to persuade the general public to accept, use, and ultimately, become enslaved by them.
The marketing campaigns attached to these technologies are not so much advertisements as they are culture creation and perception management. The coming cashless society is clearly a top-down system that is being introduced; not one that has emerged from grassroots demand.
Still, there are many who would have a hard time believing that any cashless digital system would be used for anything other than a benevolent purpose. Surely the banks and corporations that have ruined their lives in a myriad of different ways wouldn’t seek to wield any more control over them than they already do. Right?
Unfortunately, anyone who is willing to follow this incredibly naïve way of thinking is in for a rude awakening.
The road map is clear.
First, this technology will be introduced as a cool, convenient, trendy app, complete with snob appeal, with which to impress your friends.
Then, soon after it becomes more affordable, more available, and easier to use, there will be incentives offered (discounts, “credits,” etc.) to entice more and more people to use these methods of payment or identification.
Once these methods become commonplace, we will begin to see the withering away of “outdated” and “archaic” methods of payment like cash and checks, or traditional – even biometric – forms of identification.
However, managing “your entire financial life from a single device” comes with potential pitfalls. This has been highlighted, albeit unintentionally, by those who wish to promote this new wave of transaction.
Indeed, in an effort to promote the use of digital payment apps and to neutralize the security and privacy questions which massively surround their use, the USA TODAY article mentioned above states,
“Phones (and apps) can be password-protected. Security elements are built into the NFC chips. It’s easy to remotely shut down a digital wallet if necessary.”
Of course they can.
So can any microchip implanted under the skin. Yet few of those using such devices ever question whether that capability will be used for their enslavement.
When all forms of transactions become cashless, whether they end up on a card or on a smartphone app, it will only be a short amount of time before banks, corporations, cell phone companies, and governments use the power they have over the users’ accounts to force the consumer and the citizen to bend to their will.
It’s not hard to see how the routine will go…
Didn’t pay your bill? We’ll freeze your account until you do.
Didn’t register for the draft? We’ll freeze your account. After all, you can’t enjoy the benefits of living in the free world without pulling your share of the weight.
Past-due parking tickets? We’ll freeze your account until you get it all worked out.
Didn’t take the latest vaccine?
You get the idea…
A note on the AHCA rumors mentioned above: the bill does not mandate the implantation of microchips. It calls for the creation of a registry of medical devices, including implantable ones, under the guise of research and monitoring effectiveness. These devices can be anything from pacemakers to microchips, but there is no mention of forced implantation.
by Brandon Turbeville, April 22, 2015, from ActivistPost Website