Wednesday, October 18, 2017

FBI uncovered Russian bribery plot before Obama administration approved controversial nuclear deal

Before the Obama administration approved a controversial deal in 2010 giving Moscow control of a large swath of American uranium, the FBI had gathered substantial evidence that Russian nuclear industry officials were engaged in bribery, kickbacks, extortion and money laundering designed to grow Vladimir Putin’s atomic energy business inside the United States, according to government documents and interviews.
Federal agents used a confidential U.S. witness working inside the Russian nuclear industry to gather extensive financial records, make secret recordings and intercept emails as early as 2009 that showed Moscow had compromised an American uranium trucking firm with bribes and kickbacks in violation of the Foreign Corrupt Practices Act, FBI and court documents show.
They also obtained an eyewitness account — backed by documents — indicating Russian nuclear officials had routed millions of dollars to the U.S. designed to benefit former President Bill Clinton’s charitable foundation during the time Secretary of State Hillary Clinton served on a government body that provided a favorable decision to Moscow, sources told The Hill.
The racketeering scheme was conducted “with the consent of higher level officials” in Russia who “shared the proceeds” from the kickbacks, one agent declared in an affidavit years later.
Rather than bring immediate charges in 2010, however, the Department of Justice (DOJ) continued investigating the matter for nearly four more years, essentially leaving the American public and Congress in the dark about Russian nuclear corruption on U.S. soil during a period when the Obama administration made two major decisions benefiting Putin’s commercial nuclear ambitions.

The first decision occurred in October 2010, when the State Department and government agencies on the Committee on Foreign Investment in the United States unanimously approved the partial sale of Canadian mining company Uranium One to the Russian nuclear giant Rosatom, giving Moscow control of more than 20 percent of America’s uranium supply.
When Trump raised this sale on the campaign trail last year, Hillary Clinton’s spokesman said she was not involved in the committee review and noted that the State Department official who handled it said she “never intervened ... on any [Committee on Foreign Investment in the United States] matter.”
In 2011, the administration gave approval for Rosatom’s Tenex subsidiary to sell commercial uranium to U.S. nuclear power plants in a partnership with the United States Enrichment Corp. Before then, Tenex had been limited to selling U.S. nuclear power plants reprocessed uranium recovered from dismantled Soviet nuclear weapons under the 1990s Megatons to Megawatts peace program.
“The Russians were compromising American contractors in the nuclear industry with kickbacks and extortion threats, all of which raised legitimate national security concerns. And none of that evidence got aired before the Obama administration made those decisions,” a person who worked on the case told The Hill, speaking on condition of anonymity for fear of retribution by U.S. or Russian officials.
The Obama administration’s decision to approve Rosatom’s purchase of Uranium One has been a source of political controversy since 2015.
That’s when conservative author Peter Schweizer and The New York Times documented how Bill Clinton collected hundreds of thousands of dollars in Russian speaking fees and his charitable foundation collected millions in donations from parties interested in the deal while Hillary Clinton served on the Committee on Foreign Investment in the United States.
The Obama administration and the Clintons defended their actions at the time, insisting there was no evidence that any Russians or donors engaged in wrongdoing and there was no national security reason for any member of the committee to oppose the Uranium One deal.
But FBI, Energy Department and court documents reviewed by The Hill show the FBI in fact had gathered substantial evidence well before the committee’s decision that Vadim Mikerin — the main Russian overseeing Putin’s nuclear expansion inside the United States — was engaged in wrongdoing starting in 2009.
Then-Attorney General Eric Holder was among the Obama administration officials joining Hillary Clinton on the Committee on Foreign Investment in the United States at the time the Uranium One deal was approved. Multiple current and former government officials told The Hill they did not know whether the FBI or DOJ ever alerted committee members to the criminal activity they uncovered.
Spokesmen for Holder and Clinton did not return calls seeking comment. The Justice Department also didn’t comment.
Mikerin had been a director of Rosatom’s Tenex in Moscow since the early 2000s, where he oversaw Rosatom’s nuclear collaboration with the United States under the Megatons to Megawatts program and its commercial uranium sales to other countries. In 2010, Mikerin was dispatched to the U.S. on a work visa approved by the Obama administration to open Rosatom’s new American arm called Tenam.
Between 2009 and January 2012, Mikerin “did knowingly and willfully combine, conspire, confederate and agree with other persons … to obstruct, delay and affect commerce and the movement of an article and commodity (enriched uranium) in commerce by extortion,” a November 2014 indictment stated.
His illegal conduct was captured with the help of a confidential witness, an American businessman, who began making kickback payments at Mikerin’s direction and with the permission of the FBI. The first kickback payment recorded by the FBI through its informant was dated Nov. 27, 2009, the records show.
In evidentiary affidavits signed in 2014 and 2015, an Energy Department agent assigned to assist the FBI in the case testified that Mikerin supervised a “racketeering scheme” that involved extortion, bribery, money laundering and kickbacks that were both directed by and provided benefit to more senior officials back in Russia.
“As part of the scheme, Mikerin, with the consent of higher level officials at TENEX and Rosatom (both Russian state-owned entities) would offer no-bid contracts to US businesses in exchange for kickbacks in the form of money payments made to some offshore banks accounts,” Agent David Gadren testified.
“Mikerin apparently then shared the proceeds with other co-conspirators associated with TENEX in Russia and elsewhere,” the agent added.
The investigation was ultimately supervised by then-U.S. Attorney Rod Rosenstein, an Obama appointee who now serves as President Trump’s deputy attorney general, and then-Assistant FBI Director Andrew McCabe, now the deputy FBI director under Trump, Justice Department documents show.
Both men now play a key role in the current investigation into possible, but still unproven, collusion between Russia and Donald Trump’s campaign during the 2016 election cycle. McCabe is under congressional and Justice Department inspector general investigation in connection with money his wife’s Virginia state Senate campaign accepted in 2015 from now-Virginia Gov. Terry McAuliffe at a time when McAuliffe was reportedly under investigation by the FBI. The probe is not focused on McAuliffe's conduct but rather on whether McCabe's attendance violated the Hatch Act or other FBI conflict rules.
The connections to the current Russia case are many. The Mikerin probe began in 2009 when Robert Mueller, now the special counsel in charge of the Trump case, was still FBI director. And it ended in late 2015 under the direction of then-FBI Director James Comey, whom Trump fired earlier this year.
Its many twists and turns aside, the FBI nuclear industry case proved a gold mine, in part because it uncovered a new Russian money laundering apparatus that routed bribe and kickback payments through financial instruments in Cyprus, Latvia and the Seychelles. A Russian financier in New Jersey was among those arrested for the money laundering, court records show.
The case also exposed a serious national security breach: Mikerin had given a contract to an American trucking firm called Transport Logistics International that held the sensitive job of transporting Russia’s uranium around the United States in return for more than $2 million in kickbacks from some of its executives, court records show.
One of Mikerin’s former employees told the FBI that Tenex officials in Russia specifically directed the scheme to “allow for padded pricing to include kickbacks,” agents testified in one court filing.
Bringing down a major Russian nuclear corruption scheme that had both compromised a sensitive uranium transportation asset inside the U.S. and facilitated international money laundering would seem a major feather in any law enforcement agency’s cap.
But the Justice Department and FBI took little credit in 2014 when Mikerin, the Russian financier and the trucking firm executives were arrested and charged.
The only public statement occurred a year later when the Justice Department put out a little-noticed press release in August 2015, just days before Labor Day. The release noted that the various defendants had reached plea deals.
By that time, the criminal cases against Mikerin had been narrowed to a single charge of money laundering for a scheme that officials admitted stretched from 2004 to 2014. And though agents had collected evidence of criminal wrongdoing since at least 2009, federal prosecutors cited in the plea agreement only a handful of transactions that occurred in 2011 and 2012, well after the Committee on Foreign Investment in the United States approved the Uranium One deal.
The final court case also made no mention of any connection to the influence-peddling conversations the FBI undercover informant witnessed about the Russian nuclear officials trying to ingratiate themselves with the Clintons, even though agents had gathered documents showing the transmission of millions of dollars from Russia’s nuclear industry to an American entity that had provided assistance to Bill Clinton’s foundation, sources confirmed to The Hill.
The lack of fanfare left many key players in Washington with no inkling that a major Russian nuclear corruption scheme with serious national security implications had been uncovered.
On Dec. 15, 2015, the Justice Department put out a release stating that Mikerin, “a former Russian official residing in Maryland was sentenced today to 48 months in prison” and ordered to forfeit more than $2.1 million.
Ronald Hosko, who served as the assistant FBI director in charge of criminal cases when the investigation was underway, told The Hill he did not recall ever being briefed about Mikerin’s case by the counterintelligence side of the bureau despite the criminal charges that were being lodged.
“I had no idea this case was being conducted,” a surprised Hosko said in an interview.
Likewise, major congressional figures were also kept in the dark.
Former Rep. Mike Rogers (R-Mich.), who chaired the House Intelligence Committee during the time the FBI probe was being conducted, told The Hill that he had never been told anything about the Russian nuclear corruption case even though many fellow lawmakers had serious concerns about the Obama administration’s approval of the Uranium One deal.
“Not providing information on a corruption scheme before the Russian uranium deal was approved by U.S. regulators and engage appropriate congressional committees has served to undermine U.S. national security interests by the very people charged with protecting them,” he said. “The Russian efforts to manipulate our American political enterprise is breathtaking.”

Monday, October 09, 2017

How The Solar Warden And Blue Sphere Alliance Will Change Life As We Know It

By now, many people have heard about the Solar Warden group that hacker Gary McKinnon talked about uncovering in one of his hacking sessions.

In 2010, investigative reporter Darren Perks requested more information about the Solar Warden Program from the Department of Defense through the Freedom of Information Act.

He received the following response confirming the Solar Warden project existed:

About an hour ago I spoke to a NASA rep who confirmed this was their program and that it was terminated by the President. He also informed me that it was not a joint program with the DoD. The NASA rep informed me that you should be directed to the Johnson Space Center FOIA Manager.

I have ran your request through one of our space-related directorates and I’m waiting on one other division with the Command to respond back to me. I will contact you once I have a response from the other division. Did NASA refer you to us?

Perks additionally stated:

When Gary McKinnon hacked into U.S. Space Command computers several years ago and learned of the existence of “non-terrestrial officers” and “fleet-to-fleet transfers” and a secret program called “Solar Warden”, he was charged by the Bush Justice Department with having committed “the biggest military computer hack of all time”, and stood to face prison time of up to 70 years after extradition from UK. But trying earnest McKinnon in open court would involve his testifying to the above classified facts, and his attorney would be able to subpoena government officers to testify under oath about the Navy’s Space Fleet.

McKinnon also found out about the ships or craft within Solar Warden. It is said that there are approximately eight cigar-shaped motherships (each longer than two football fields end-to-end) and 43 small “scout ships.” The Solar Warden Space Fleet operates under the US Naval Network and Space Operations Command (NNSOC) [formerly Naval Space Command]. There are approximately 300 personnel involved at that facility, with the figure rising.

Solar Warden is said to be made up from U.S. aerospace Black Projects contractors, but with some contributions of parts and systems by Canada, the United Kingdom, Italy, Austria, Russia, and Australia. It is also said that the program is tested and operated from secret military bases such as Area 51 in Nevada, USA.

Project Camelot’s Kerry Cassidy interviewed military whistleblower Mark Richards, who also confirmed the existence of this program.

We’ve recently heard about the ET group, the Blue Avians, who are associated with the Blue Sphere Alliance and who have reportedly been arriving in our solar system since 1999. According to contactee Corey Goode, there are over 100 Blue Spheres helping our planet to push for complete disclosure and the release of suppressed technology. Goode stated that some of these spheres are as large as some of the biggest planets in our solar system.

This is a brief synopsis of Goode provided by Dr. Michael Salla:

Corey Goode (aka GoodETxSG) has revealed more about his claimed contact experiences with a group of extraterrestrials called the Sphere Alliance that is cooperating with an alliance of secret space programs indigenous to Earth. He has earlier revealed that there are five secret space programs belonging to our current Earth civilization, as well as between five and seven space programs belonging to ancient human civilizations that are still operating. Several of these secret space programs indigenous to Earth make up what he calls the Secret Space Program (SSP) Alliance, which should not be confused with the Sphere Alliance, which he claims comprises five different extraterrestrial races that are here to help humanity deal with an upcoming “Event” that will change life dramatically on the planet.

Corey claims to have attended several off-planet meetings where three of these extraterrestrial races belonging to the Sphere Alliance have appeared. The alien race he has communicated with the most is called the Blue Avians; one of them, “Tear-Eir,” he met with as recently as this past weekend. In addition to his recent contact experiences, Corey also claims to have earlier served, from 1987 to 2007, with several secret space programs as an Intuitive Empath. In an earlier question-and-answer email exchange, Corey described the five secret space programs as follows:

Solar Warden – mainly focused on policing the Solar System and surrounding Star Clusters;
Interplanetary Corporate Conglomerate (ICC) – focused mainly on development and acquisition of technology by any means;
Dark Fleet – worked almost entirely outside the Sol System, very military (offensive);
“NATO TYPE SSP” – recently, in Alliance Conferences, they were referred to as the “League of Nations Program”;
Various Special Access Program SSPs that were small, usually had the newer technology, were very secretive, and worked for some of the Secret Earth Governments.
This is David Wilcock’s account:

“100 spheres the size of either Neptune or Jupiter have entered our solar system in the last 2-3 years. There were many others the size of our Moon that came in between 1999 and 2001…

The Cabal used an advanced weapon to fire on one of the Moon-sized ones in mid-December. The object lit up, becoming a bright red spot in the sky that people saw… and redirected the beam back to the base it came from. This caused serious damage and some loss of life… Since that time we have had the no-fly zone around the earth and the quarantine around our solar system. No one who was already here is being allowed to leave — either from Earth or any of the “colonies,” as they are called.

The group that has appeared from the spheres is called the “Blue Avians.”… One of the five main factions in the Space Program, named Solar Warden, is allied with them now, and they are working with the Alliance on earth, but never the Cabal… The… main thing they said was that Cabal people were defecting over into the Alliance and altering the plans.

The key is that the Avians want us to move into a loving and peaceful world. They want this transition, similarly, to be as peaceful as possible.”

Goode gave further confirmation of the Solar Warden program in his interview with Wilcock on the Jimmy Church Show.

In Dr. Salla’s Q&A session with Goode, Salla asked:

“Is there currently a disclosure race happening between the Cabal/Illuminati and the Solar Warden/Sphere Alliance in terms of how much information should be released and how this frames key issues?”

Goode stated:

Yes, When FULL Disclosure occurs it will also come with the disclosure of massive crimes against humanity that will send the bulk of society into shock for some time. The fact that there are aliens will not be as disruptive or shocking as what all the “Elites” have done in secret during the time they were hiding the existence of aliens and most of all the high technology that would have ended the financial system decades ago.

IF the “CABAL/ILLUMINATI” can control the Disclosure they can also control the narrative and spin of disclosure they think to make themselves come out looking better in the end. They had planned on being off planet when all this info broke. Now that they are stuck here with everyone else they are all turning on each other and spinning and cutting deals as fast as they can. They know what the population will call for if there is full and unfettered disclosure of our true history along with the release of suppressed technologies and the existence of alien life.

They MUST do everything they can to control the narrative and manner that this information is disclosed. This is a much more complicated situation than most realize. Complete disclosure is going to leave many of the world’s population calling for “heads to roll”.

Apparently, negative factions are trying to flee the planet and solar system but are unable to because of the Sphere Alliance, about which Goode stated:

“The Sphere Alliance has setup 2 “Energy Barriers” that cannot be penetrated by any beings or technology (be they 3/4D or 4/5D beings). One is the “Outer Barrier” which completely closes off the entire Sol System (including Gate/Portal travel) from travel into or out of the system. The other is a “Barrier” that surrounds the Earth. It keeps every Being and Craft/Technology that was either On Earth or in Near Earth Orbit from leaving a certain distance from the Planet. There are active Gates/Portals that allow travel between Planets within the Sol System. However those Portals are affected by these “Energetic Tsunami Waves” at times and regular travel is ill advised and can be dangerous.

There are also a lot of recent skirmishes between different “Cabal” factions since they have realized that their “Royal White Reptilians” have tried to negotiate their safe passage out of the Sol System by giving up all of their Illuminati/Cabal followers as well as their lower cast Reptilian and Other Allied Races in their Federation. There is chaos among these groups who are following the example of their masters and turning on each other. Though they are heavily denying this publicly and putting their most clever disinformation agents out to control the narrative as best they can.”

You can find updates on Corey’s website, Sphere Being Alliance.

Corey will be doing weekly interviews with David Wilcock on GaiamTV; the first one is July 14th, 2015.

This aligns with what Dr. Simon Atkins stated about CERN… that it’s being used to prevent the incoming intergalactic wave of energy that will transform consciousness. If this happens, it would immediately expose everyone who has committed crimes against humanity.

The ruling elite and ET factions may be trying to use CERN as a means of creating a portal to escape but because of the barriers placed over the planet by the Sphere Alliance, this cannot happen. Basically, they’re all screwed and will be turning on one another while trying to save their own asses.

If all of this comes to fruition, you can expect to see a world like you’ve never seen before! Imagine if all of the suppressed technologies were released? Imagine receiving advanced technologies from our galactic neighbors, such as a replicator that can instantly manifest anything you want to eat?

While it all sounds like science fiction right now, there are enough sources and confirmations to give this information credibility.

Friday, September 22, 2017

Blizzard knuckles down on community as Valve fiddles

Faced with the ongoing problem of review bombing, Valve has taken exactly the kind of firm, decisive action we've come to expect of the company; it's added some graphs to the interface. Nothing's going to top the sheer shade of Gamasutra's headline on this topic - "Steam considers preventing review bombs, adds graphs instead" - which perfectly sums up the exasperation so many developers feel with the service at this point. "We're aware of the problem, and doing as little as humanly possible to fix it" is in many ways the worst response the company could have conjured.

Review bombs, for the uninitiated, are when a mob descends on Steam to leave a flood of poor reviews - often accompanied by a grossly offensive comment - on a game whose developer has done something to raise their hackles. The most recent example was the review-bombing of Firewatch after its developers demanded PewDiePie remove videos of the game following his latest racist "slip of the tongue", but dozens of developers have been targeted for this kind of treatment, often simply for the sin of being female or a minority, or for in some way otherwise upsetting the denizens of Reddit and 4chan. Review bombing can hammer a game's rating on Steam and consequently seriously damage its sales - it's a direct and effective way for online mobs to damage the livelihoods of developers.

    "What's most frustrating is that the data Valve is now revealing makes it clear how trivial it would be to fix this problem if they actually cared"

Valve's response has been to effectively shrug off responsibility for these problems and place it entirely on the shoulders of its customers. Steam users will now be expected to look at a graph for a game to make sure it's not being review bombed, a process that assumes that they've bothered looking at a game with a low review score in the first place, that they understand what review bombing is, and that they know what it looks like on a graph - all of which seem vanishingly unlikely, making this move entirely unhelpful. What's most frustrating is that the data Valve is now revealing makes it clear how trivial it would be to fix this problem if they actually cared; an automated system that flagged possible review bombing for human review would be beyond trivial to implement.
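To make the "beyond trivial" claim concrete, here is a minimal, hypothetical sketch of what such automated screening might look like: flag any day whose negative-review count spikes far above a trailing baseline and queue it for human review. Valve's actual data model and systems are not public; the function name, the daily-count data shape, and the thresholds here are all assumptions for illustration.

```python
from statistics import mean, stdev

def flag_review_bombs(daily_negative_counts, window=30, threshold=4.0):
    """Flag days whose negative-review count spikes far above the trailing
    baseline - a crude stand-in for the kind of automated screening the
    article argues for. Returns the indices of suspect days, which a human
    moderator would then inspect."""
    flagged = []
    for i in range(window, len(daily_negative_counts)):
        baseline = daily_negative_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # Guard against a near-flat baseline (sigma close to zero).
        if daily_negative_counts[i] > mu + threshold * max(sigma, 1.0):
            flagged.append(i)
    return flagged

# A quiet month of reviews followed by a sudden pile-on:
history = [3, 2, 4, 3, 2] * 6 + [80]
print(flag_review_bombs(history))  # -> [30]
```

Even a naive heuristic like this surfaces the obvious spikes; the point of the piece is that the hard part is not the detection code but staffing the human review behind it.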

This is the crux; the problem is not technological, it's philosophical. Valve clearly has a strong philosophy and political standpoint about how online services should work and about how the freedom of users (including users who haven't actually bought a game but still want to give it a 1-star review and call its creator a variety of colourful names) should be balanced against the protection of creators' livelihoods. Moreover, it has a belief that where problems do arise, they can be fixed with more data and tweaked algorithms; this reflects the broad Silicon Valley ideological fervour about the power and purity of algorithms, which conveniently ignores the extraordinary degree to which code that deals with human behaviour tends to reflect the implicit biases of its authors. Valve absolutely knows it could fix this issue with tighter rules, more human supervision and, yes, some better algorithms to report back to those supervisors; it just doesn't want to do that, because that's not its philosophy.

On the flip side of this coin, you've got Blizzard - a company which embraces technology with every bit as much fervour as Valve, but takes a very different approach to protecting its players and policing its community. Last week, Overwatch game director Jeff Kaplan laid things out very clearly - the team has absolutely refocused personnel and resources away from other features and content in order to address player toxicity in the Overwatch community.

    "There will always be a subset of people who get their kicks from aggressively ruining other people's fun... For them, harassment and abuse is the game, more so than the game itself"

It's throwing people, time and money into making its blockbuster game into an environment where people can play without facing abuse and harassment. Some of that effort boils down to developing smarter systems - Kaplan bemoaned the fact that resources have been diverted from other features in order to implement player-reporting on consoles, a feature Overwatch initially lacked on those platforms in the belief that consoles' own native player management systems would suffice. Other simple aspects of the game's systems and designs are clearly designed to avoid player harassment - dropping kill:death ratios from the scoreboard is one such move, for example.

Ultimately, though, the work Blizzard is doing on Overwatch - and that other companies that genuinely care about this issue, such as League of Legends creator Riot, are doing on their games - is about people, not algorithms. Clever design and good code gets you so far in managing human behaviour, but there will always be a subset of people who get their kicks from aggressively ruining other people's fun, and will devote themselves to learning to game any system you throw in their way. For them, harassment and abuse is the game, more so than the game itself; to deal with them, ultimately, you need living, breathing people empowered to make good decisions on behalf of the community.

This is, perhaps, what Valve doesn't understand - or is the understanding which it rejects on the grounds of its philosophy. Review bombing is a form of "gaming" Steam's systems; it arose because groups of people discovered that it was possible to manipulate Steam's data and algorithms in a way that achieved their political objectives. You can't fix that by just providing more data; the only way to prevent it from impacting on the income and livelihoods of developers (who, let's remember, give Steam a hell of a cut of their sales, and deserve to be treated with a damned sight more respect by Valve) is to seal up loopholes and add a layer of human supervision, in the form of people with clear priorities to protect creators and preserve Steam as a great retail platform, not a vector for online abuse mobs.

Blizzard's philosophy here (and again, it's not just Blizzard - plenty of other game operators also recognise the value of properly policing and improving their communities) is a great counter-example, and a window into what a service like Steam could be like if Valve rooted out some of this counter-productive philosophy from its way of doing business.

Steam is one of the most important developments in the games business in the last 20 years; it effectively rescued PC gaming from commercial doldrums and placed it on a path to the rude health it's in now. Its very importance, both commercially and culturally, cries out for better management; for it to slide into being a hostile environment for developers and a vector for online abuse (and thus also a pretty unpleasant place for the average consumer to shop for games) is a deeply worrying and unfortunate outcome.

Valve's critics don't pile on the company out of dislike, but out of disappointment at seeing such a vital part of the industry go to seed; if the company is really serious about fixing these issues, we'd suggest that a good first step would be to put down the graphs and see if maybe Blizzard is free for a chat on the phone.

Tuesday, August 29, 2017

Switch is Nintendo's last console

"Nintendo Switch. Starting to feel a little hype, not gonna lie."
That was me in October, following Nintendo's reveal of the upcoming console. I was perfectly fine with the initial reveal of the Switch, feeling that glimmer of hope that, as an admitted Nintendo fangirl, I have grown all too familiar with. However, I am also familiar with disappointment.
The Wii U was an epic flop, so much so that recapping the ways in which the system fell short is unnecessary. And in all honesty, the Switch is what the Wii U should have been in the first place. The Switch gives players multiple ways to play without the limitations of its predecessor, offering a bridge between traditional gameplay on a TV and the on-the-go appeal of the 3DS. Unfortunately, the upcoming system is coming just a bit too late, and brings with it plenty of red flags of its own.
The price is too high at $299, especially when you factor in that there will be no bundled game and that the included Joy-Con grip will not charge the Joy-Con controllers - you'll need to buy a charging grip separately. In addition, the prices of the replacements and peripherals revealed for the console are also too high. A pair of replacement Joy-Con controllers will set you back $80, while a single controller will cost $50. A replacement dock set is $90, and the Nintendo Switch Pro Controller is $70. While the console itself feels overpriced, particularly when noting the absence of a bundled game, the replacements and the peripherals are almost shockingly expensive.
While the Wii U has been heavily criticized for coming out too strong and failing to maintain third party support and a healthy, consistent flow of games, Nintendo appears to be doing the exact opposite with the Switch in what is too drastic a pendulum swing. The lineup - both at launch and during its first year on the market - is concerning. At this point, the Nintendo Switch will be releasing with ten games in the US - Zelda: Breath of the Wild, 1-2-Switch, Just Dance 2017, Skylanders Imaginators, The Binding of Isaac: Afterbirth+, I Am Setsuna, World of Goo, Little Inferno, Human Resource Machine, and Super Bomberman R. The two Nintendo launch titles are Zelda, which will be releasing simultaneously on the Switch and Wii U, and 1-2-Switch, which is a collection of mini-games that should have been bundled in with the console but will instead set you back $50 so that you can pretend to eat a hoagie or milk a cow while making awkward eye contact with your friends.
As of right now, Nintendo has fewer confirmed Switch games releasing in all of 2017 than the Wii U had at the time of its launch. While it's possible more will be revealed, this isn't the most comforting list to have in front of you with less than two months to launch - and, if there was going to be another large title launching alongside the console, I have to assume we would have heard something about it by this point. According to Nintendo of America President Reggie Fils-Aime, this is intentional. Speaking with CNET, Fils-Aime stated that Nintendo has a "great ongoing march of content to motivate you to jump into the platform."
"Launch day is not the be-all and the end-all," Fils-Aime said. "It really is the steady pacing of content that continually reinforces for the people who bought into the platform why they made a smart choice, as well as what compels people who might be sitting on the sidelines to jump in."
However, it is questionable how long this content will last, even when released at a steady pace. In the 2017 state of the industry report issued by the Game Developers Conference, more than 4,500 developers answered a variety of questions - including what consoles they were currently developing for. Only 3% confirmed that they are currently developing for the Nintendo Switch, while 22% are developing for Xbox One/Scorpio, 27% are developing for PlayStation 4/Pro, 38% are developing for Smartphones and Tablets, and 53% are developing for PC/Mac. Fewer developers admitted to developing for the Switch than confirmed development for Apple TV.
The release schedule is hardly the biggest hurdle that the Switch faces. Prior to last week's detailed Switch presentation, I penned an editorial laying out a list of must-haves for the console to be a sure success. Looking back, I realize I was too optimistic about what I had thought were simple expectations. While a strong launch and strong third party support were the first points I argued for - and, sadly, am unable to check off - I also argued that the Switch needs good battery life. If one of the major selling points is that it's a hybrid, with equal importance given to both docked and undocked play, then the battery has to reflect that. However, Nintendo has revealed that the Switch's battery will last 2.5 to 6.5 hours between charges, depending on the game you are playing, with Breath of the Wild lasting about three hours. The one bonus here is that you can charge the Switch via USB-C and play while charging - but unless you're carrying around a compatible power bank, this will drastically limit the portability of the console.
Value for the price is another issue. The Switch has only 32GB of internal storage, making digital downloads very unattractive for gamers. While the console's storage can be expanded via microSD card, a good 256GB SDXC card retails for up to $200, leaving you with less total storage than a 500GB PlayStation 4 at a greater cost. While portability does add value to the Switch, the above issue of battery life makes it a bit less appealing.
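To put those storage numbers side by side, here is a quick back-of-envelope sketch using only the figures quoted above (the per-gigabyte number is my own arithmetic, not a quoted price):

```python
# Storage cost comparison using the prices cited in this post.
switch_internal_gb = 32
sd_card_gb = 256
sd_card_price_usd = 200.0   # high-end 256GB SDXC price quoted above

total_switch_gb = switch_internal_gb + sd_card_gb        # fully expanded Switch
expansion_cost_per_gb = sd_card_price_usd / sd_card_gb   # what the expansion costs per GB

ps4_gb = 500  # a base PlayStation 4's included storage, no extra purchase needed
```

Even fully expanded, the Switch tops out at 288GB - still short of a stock PS4's 500GB - with the $200 card working out to roughly $0.78 per gigabyte.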
The truth is, Nintendo's consoles are what we put up with in order to access the games we want to play.
The Escapist's Steven Bogos shared his hands-on impressions of the Switch from PAX South, stating that the Switch, when viewed as a new handheld, is impressive enough to replace Nintendo's 3DS, but that as a home console "the Nintendo Switch is a bust."
"It's not as powerful as a PS4 (made obvious by the framerate issues I had on Breath of the Wild), which means that it's going to have the same third-party issues that Nintendo always has," he added.
Bad timing and bad choices have led Nintendo to a pivotal moment. The gaming industry has always benefited from competition, but everything that we know of the Switch shows that Nintendo only appears to be competing with one company - itself. The Switch is too limited to replace smartphones and tablets, and it is far too weak and limited, in both its hardware and third party support, to be a true home console competitor. As much as I would love to see Nintendo succeed, I cannot shake the feeling that the Switch will be yet another hardware flop. Nintendo's value is not in its hardware, as the company is more interested in weird gimmicks than actual gamer-friendly home consoles. No, Nintendo's value is in its IPs - Mario, Zelda, Pokemon, Splatoon, METROID (cough, cough) - which are still gaming staples with the ability to captivate players. What these franchises deserve, more than anything, is a home worthy of them. The truth is, Nintendo's consoles are what we put up with in order to access the games we want to play.
Nintendo giving up on consoles and focusing on software is an idea I've tossed around for the past few years, particularly since the company began teasing the new console. "If this doesn't work, Nintendo should just focus on games," I'd say, and even the most stalwart of fans have struggled to argue against that point. While I love the 3DS, and can easily see Nintendo maintaining a strong presence in the portable gaming market, there is still so much untapped potential in franchises that have suffered from the lack of suitable hardware.
The PlayStation 4 and Xbox One are both successful because they're more alike than they are different. The Wii U went a totally different way, and as a result was difficult for developers to create games for, and it appears that the same will likely be true for the Switch. But no one should count on the lights being turned off at Nintendo - the company just needs to "switch" to more energy efficient bulbs.

Monday, August 14, 2017

Nintendo Switch Won’t be Copied, but Only Expects 50 Million Lifetime Sales

Infamous industry analyst Michael Pachter recently appeared in his regular video series, the ‘Pachter Factor’, where he answers questions from fans about the gaming industry. In the latest episode, number 72, one of the questions that came up was related to the Nintendo Switch. Seeing that the system has had strong sales since it launched back in March, the questioner wondered whether this could entice either Sony or Microsoft to create their own hybrid devices, thus directly competing with the Switch.
Pachter immediately dismissed the idea of either company creating a hybrid system, noting that Sony has not found a lot of success in the portable market with the PSP and PS Vita both being trumped by their competitors. Microsoft has yet to release a portable system, and Pachter doesn’t expect them to make a move in that market anytime soon. He then shifted the focus to the beginning of the question where the Switch’s strong sales were mentioned.
Comparing the Switch to other handhelds, Pachter doesn’t expect it to do very well. His reason for this viewpoint is the price, but he noted that it’s impossible to know until “supply and demand are in balance”. He pointed to Nintendo’s goal of selling 10 million Switch units this fiscal year, saying that “we’ll see”. Overall, he expects the Switch to sell about 50-70 million units in its lifetime, but stands firm on the notion that price will ultimately be the deciding factor:
“If they keep the price at $300, it will sell 50 (million). If they cut the price to $200 pretty quickly, it will sell 70 million. If they cut the price to $100, it will sell 90 million. But, at a price…everything is all based on supply and demand. If demand goes up, then price goes down. So, “success”, at $300 then there’s no way they’re selling more than 50 million—no chance."

My response to this is that Pachter is a bit off in trying to compare the Switch to other handhelds. I’ve emphasized this many times in the past: the Switch is a home console first and foremost. Nintendo itself classifies the Switch as a home console with the functionality to be transformed into a portable device. As a result, any comparisons should be made against other home platforms, whether from Nintendo or its competitors.
On that note, if the Switch does sell 50 million like Pachter thinks it will, then that would basically tie it with the SNES as the third-best selling Nintendo home console ever. If the Switch hits 70 million, that would put it above the original Nintendo Entertainment System in the second-best selling spot. To become the true best-selling Nintendo home console, the Switch would have to surpass the Wii’s massive achievement of over 100 million units sold. Some analysts have predicted that the Switch will do just that, but we’re going to have to wait and see how it turns out.
As Pachter mentioned, trying to make a prediction right now is difficult. The Switch is only half a year old at this point, which is still far too early to determine its long-term success. Right now, the system is clearly performing much better than the Wii U did. Nintendo currently lists Switch lifetime sales at 4.7 million as of June 2017. It’s now the middle of August, so there’s a good chance that the Switch is somewhere in the realm of 5 million at this point. If Nintendo really does hit or even surpass its goal of selling 10 million units by March 2018, then the Switch would already be closing in on the Wii U’s lifetime sales of about 14 million, which took four years to amass.
While it’s currently impossible to predict how well the Switch will sell, it’s at least notable that Nintendo is pleased with its performance so far. Nvidia, which built the custom Tegra processor inside the Switch, is also pleased with the system’s strong sales. Even Sony has acknowledged that the Switch has been doing very well. All in all, things are looking pretty decent for the Switch in its current state, but we’ll just have to wait and see what the future holds.

Wednesday, August 02, 2017

Top 50 horror films

The Shining
The Thing (1982)
The Mist
The Silence of the Lambs
The Exorcist
The Omen (1976)
Let the Right One In
Bram Stoker’s Dracula
Silent Hill
The Descent
In the Mouth of Madness
The Ring
The Cell
The Evil Dead
Psycho (1960)
Dawn of the Dead (1978)
House on Haunted Hill (1999)
Event Horizon
Sleepy Hollow
Re-Animator
Eden Lake
Creepshow 2
The Fly (1986)
28 Weeks Later
[REC 2]
The Wicker Man (1973)
The Devil’s Backbone
A Serbian Film
Skeleton Key
The Orphanage
Day of the Dead (1985)
Cabin by the Lake
The Texas Chain Saw Massacre
A Nightmare on Elm Street
Ghost Ship
The Hitcher (1986)

Top 50 Films of the 90s

1. Goodfellas
2. The Shawshank Redemption
3. Neon Genesis Evangelion: The End of Evangelion
4. Ninja Scroll
5. Ghost in the Shell
6. Macross Plus
7. The Matrix
8. Terminator 2: Judgment Day
9. Jurassic Park
10. Saving Private Ryan
11. Reservoir Dogs
12. The Silence of the Lambs
13. Fight Club
14. Leon: The Professional
15. Heat
16. Forrest Gump
17. Braveheart
18. The Lion King
19. Toy Story
20. Beauty and the Beast
22. Aladdin
23. Princess Mononoke
24. Unforgiven
25. American Beauty
26. Trainspotting
27. Good Will Hunting
29. Eyes Wide Shut
30. Independence Day
31. The Nightmare Before Christmas
32. The Fugitive
33. Batman Returns
34. Dazed and Confused
35. 10 Things I Hate About You
36. Clueless
37. Titanic
38. Romeo + Juliet
39. GoldenEye
41. Richard III
42. Tomorrow Never Dies
43. Rushmore
44. Wayne's World
45. Austin Powers: International Man of Mystery
46. Austin Powers: The Spy Who Shagged Me
47. Sling Blade
48. Wayne's World 2
49. The Fifth Element
50. A Few Good Men

Sunday, July 30, 2017

The Most Talked About Game

A lot of exciting things have happened in the games industry since 2013. That time has seen the mobile game space rise to maturity; it's seen Sony return to console dominance with PS4, and Nintendo bounce from its greatest heights to its lowest ebb.
And yet one thing has stayed consistent throughout that entire four-year period. Through it all, Grand Theft Auto V has steadily, unstoppably continued to sell huge numbers every single week. In 2017 so far, it's the best-selling game in the UK; in the United States it charts in fourth place.
Previous entries in the Grand Theft Auto series were, of course, landmark titles in their own right - both culturally and commercially. Their content sparked controversy and, from the point when the series shifted into an extraordinary open world with Grand Theft Auto 3, their enormous sales pushed them into a mainstream consciousness that had generally glossed over videogames up to that point. Grand Theft Auto came to be the series that defined perceptions of games in the 2000s, perhaps even more so than Mario or Sonic had done in the 1990s.
"Never before has there been a game like GTAV, which has served as a touchstone for an entire era of gaming"
Grand Theft Auto V, however, has quietly gone beyond that and become something even more. I say quietly, because it's not necessarily something that you see if you're an ordinary game consumer. For most of us, Grand Theft Auto V was a game - a really great, beautifully made, fantastic game - that we played for a pretty long time a few years ago. We've moved on, though sometimes it comes up in conversation, or you see a really crazy stunt video on YouTube; it's part of gamer consciousness, but arguably no more than a number of other superb games of the same era.
Yet unlike all those other games, GTAV keeps on selling. People keep walking into shops and buying it; 340,000 copies in the UK alone this year. The only plausible explanation for those sales is that GTAV is being purchased along with, or soon after, many consumers' upgrades to next-gen consoles or higher-spec PCs. Far more than its predecessors, the game has become a cultural touchstone - something that you simply buy by default along with a new game system.
Of course, individual game consoles have had must-own games before; how many people bought Halo with the original Xbox, or Mario 64 with the Nintendo 64? Never before, however, has there been a game like GTAV, which has served as a touchstone for an entire era of gaming. The closest point of comparison I can think of is something like The Matrix, which was the go-to DVD for people buying new DVD players in the late 1990s, or Blade Runner's Director's Cut, which served a similar role for Blu-ray. Nothing before now in the realm of videogames comes close.
"Something we don't know, however, is what people are actually doing with those new copies of GTAV"
Something we don't know, however, is what people are actually doing with those new copies of GTAV; the huge question is whether they're buying them for the game's excellent single-player experience, or whether they're diving into GTA Online. The online game has been a runaway success for publisher Take Two, and has definitely helped to prolong the longevity of GTAV, but it's hard to quantify just how much it has to do with the continued strong sales of the game itself.
That question is important, because if people are primarily buying GTAV as an online game, it makes it a little easier to categorise that success. In that case, it would belong alongside titles like League of Legends, World of Warcraft or Destiny; enormous, sprawling games that suck up years upon years of players' attention.
From a commercial standpoint, the industry is still a little unsure what these games are or what to do about them; they are behemoths on the landscape that everyone else needs to navigate around, but while many people share an intuition that they collapse revenues for other games in the same genre, it's not entirely clear as yet what influence they really have on everything else on the market. If GTAV fits in with those titles, albeit on a level of its own to some degree, then it makes sense; it fits a pattern.
"GTAV became embedded in our collective consciousness until it was The Game You Buy When You Finally Get A PS4"
My sense, however, is that GTAV is something entirely different. It's not quite, as Take-Two CEO Strauss Zelnick rather bombastically claimed at E3, that there are no "other titles... clustered around GTA from a quality point of view." GTAV is a brilliant game, but it's hard to support the claim that there's nothing else out there of similar quality.
Rather, it's that GTAV has struck a series of notes perfectly, stitching together a combination of elements each of which is executed flawlessly and which combined to make a game that is memorable, replayable, funny, challenging, and - vitally in this era - a never-ending source of entertaining video clips for YouTube or Twitch. Almost every aspect of GTAV is good, but there's no single part you can point to and say, "this is why this is the game that defines an era." The magic lies in the sum, not the individual parts.
And perhaps it's something more than even that; perhaps GTAV isn't just the right game, it's also a game that's appeared at the right time.
Think of the average age of a game consumer, which is well into the thirties at this point. Think of how games have come to be a part of our cultural conversation; no longer in a dismissive way, but as a field of genuine interest, a source of inspiration for other media, a topic of watercooler conversation. Think too of how videogames have begun to inform the aesthetics of the world, from the gloss of Marvel's movies to the more obvious homages of Wreck-It Ralph or (god help us) Pixels. Somehow they've even managed to rope Spielberg into adapting inexplicably popular execrable teenage gamer fanfiction novel Ready Player One. Games are embedded as part of the world's culture and, more importantly, part of how we talk about that culture.
GTAV arrived, in stunning, endlessly discussable, endlessly uploadable form, right at the moment when that transition was being completed. There's no way to quantify this, but I'll wager GTAV holds a special record that'll never go in Guinness' book: I'll wager it's the most talked-about game of all time. Not because of controversy or scandal; it's a game that's simply been talked about in conversation after conversation, four years of discussing stunts and jokes and achievements and easter eggs, until the game became so embedded in our collective consciousness that it was The Game You Buy When You Finally Get A PS4.
There's never been a game that occupied a place in the public consciousness quite like GTAV; but now that such a place exists for games in our collective cultural consciousness, perhaps it won't be very long before more fantastic games roll up to take on similar roles.

Saturday, July 22, 2017

Was NASA Technology Predicted in Ancient Indian Writings?

Is it possible that ancient cultures 7000 years ago knew how to create flying machines to traverse the sky and beyond using a technology that NASA engineers are still trying to harness today?

The first artificial satellite launched into orbit was, famously, the Soviet satellite Sputnik in 1957. Prior to this, rockets had been used to launch missiles for warfare. The first rocket able to fly high enough to reach space was the German A4/V-2 rocket family, first launched in 1942. Even considering early powered flight and the first models of the aeroplane, these advances still only date back to the beginning of the 20th century. However, there are many books and websites which forcefully and passionately assert that technologically advanced aircraft and spacecraft were in common use over the Indian subcontinent thousands of years ago. These same sources claim that advanced space propulsion techniques being researched by NASA are in fact directly inspired by ancient flying machines.

Flying Vimanas, is this inspiration for current NASA technology? Seriously?  (Image credit: Wikimedia Commons)

Could these amazing stories be true? The answer will be obvious to anyone with any knowledge of science or history, but let us try to look at the claims as fairly as we can.
In Hindu mythology, demon King Ravana of Lanka (ancient Sri Lanka) brought to the country something called the Pushpaka Vimãna. Vimãna literally means to traverse or to measure; other uses of the word refer to a temple or a flying machine. The Pushpaka Vimãna is thought to be King Ravana’s flying palace, which was shaped like a giant peacock. Every country has its myths and legends. In Ireland, Finn McCool is a familiar character in ancient stories. So where does fiction meet fact? And how is this ancient flying machine relevant to today?
Ten-headed King Ravana, inventor of ancient flying machines? (Image credit: Wikimedia Commons)

Some say that “ancient scriptures” tell of these ancient aeronauts, but there is no record of these documents prior to the last century. A publication called the Vaimãnika Shastra, or the Science of Aeronautics, is a 20th century text written by one Subbaraya Shastry sometime between 1918 and 1923. However, the information provided in this book is believed to have been obtained by psychic channelling with the ancient saint Bharadvaja. Subbaraya Shastry was believed to have contracted leprosy; he left his home and spent nine years living in the forest. During this time he is supposed to have spoken with the ancient sage Bharadvaja and been enlightened with this newfound knowledge of flying machines. He later returned home (having also been cured of leprosy), but since Shastry could not read or write, he dictated his new knowledge over a period of five years (some 25 years after the psychic experience itself). The dictated text was apparently discovered in 1952 by G.R. Josyer, who later translated it into English in 1973. This publication contains eight chapters claiming that the ancient vimãnas of the King Ravana legend were actually feasible flying machines, perhaps even similar in ability to rockets. The text indicates that propulsion was provided by rotating gyroscopes of electricity and mercury.
The existing text is said to be only a small section of larger (lost) works. It aims to provide pilots with the secrets of hearing and destroying enemy planes while remaining motionless, unbreakable and invisible. According to American author David Hatcher Childress (born 1957), another ancient Indian work that he decoded, the Samarangana Sutradhara, reveals that the ancient flying vimãnas referred to in the ‘Vaimãnika Shastra’ were powered by the metal mercury. He states that the text says: “By means of the power latent in the mercury which sets the driving whirlwind in motion, a man sitting inside may travel a great distance in the sky…” This powerplant is referred to as a “mercury vortex engine”.
The ‘Vaimãnika Shastra’, although dubbed an ancient text, was in reality written less than 100 years ago. It was supposedly dictated by psychically channelling an ancient saint, yet it was written down almost 25 years after the author had the experience. The author was believed to be a leper who spent years in isolation, and coincidentally, mercury was historically used as a treatment for leprosy. Perhaps this was the inspiration for the fuel of the vimãnas? David Hatcher Childress has published over 200 books, mostly on unusual topics such as ancient astronauts and the lost city of Atlantis; his works have been criticised by historical archaeologists for being factually incorrect. He is also the owner of a publishing house, and mysterious topics no doubt generate more interest and sales.
A 1974 study by the aeronautical and mechanical engineering department at the Indian Institute of Science described the craft as “poor concoctions” that were unfeasible for flight. The text does not explain how the vimãnas actually get up into the air, and while the author appears to have had knowledge of some machinery modern for the early 1900s, he shows little to no understanding of aeronautics. The illustrations, which were based on the text, are more comparable to steampunk flying machines. So despite masquerading as ancient designs, these creations are far from ancient in reality.
NASA ion engine testing for deep space craft. (Image credit: NASA)

NASA (and other research organisations) have been experimenting with the ion propulsion concept since the 1950s. In an ion propulsion engine, the gas is held in a chamber surrounded by magnets. An electrical charge strips electrons from the atoms, turning them into ions. As the ionised gas jet is expelled from the craft, the craft is propelled in the opposite direction. In the 1970s, when this technology was being trialled in earthbound laboratories, mercury or caesium was used as propellant. Mercury is liquid at room temperature and caesium is a solid, so both were relatively easy to store; to be used as propellant, however, both elements needed to be heated to become a gas.
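As a rough sketch of the physics just described (the grid voltage and propellant flow rate below are illustrative ballpark numbers, not the specifications of any particular engine), the thrust of an ion engine follows from the speed the ions pick up as they fall through the accelerating voltage:

```python
import math

# Physical constants (SI units)
ELEMENTARY_CHARGE = 1.602176634e-19        # C, charge of a singly ionised atom
XENON_ION_MASS = 131.293 * 1.66053907e-27  # kg, average xenon atomic mass

def exhaust_velocity(grid_voltage):
    """Speed of a singly charged xenon ion after being accelerated
    through the grid potential: v = sqrt(2 q V / m)."""
    return math.sqrt(2 * ELEMENTARY_CHARGE * grid_voltage / XENON_ION_MASS)

def thrust(mass_flow_rate, grid_voltage):
    """Thrust from expelling propellant: F = m_dot * v_e."""
    return mass_flow_rate * exhaust_velocity(grid_voltage)

# Illustrative numbers: ~1000 V grids and ~3 mg/s of xenon give an
# exhaust velocity near 40 km/s but a thrust of only ~0.1 N.
v_e = exhaust_velocity(1000.0)
f = thrust(3e-6, 1000.0)
```

The tiny thrust relative to the very high exhaust speed is the trade-off behind ion propulsion: it is extremely fuel-efficient, but a craft like Dawn needs a long time to build up speed or manoeuvre into orbit.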

Artist’s impression of the Dawn space craft travelling to asteroid Vesta (left) and Ceres (right). (Image credit: Wikimedia Commons)

The problems arose when some of the gases condensed and leaked onto the ground. Caesium is highly reactive, can be corrosive and is very dangerous; mercury is a neurotoxin, one of the more dangerous metals, and exposure to it can cause various health problems. After determining that both these elements were much too toxic and dangerous to use, scientists turned to xenon, an odourless, colourless gas which is generally unreactive. This idea was put to use on the Dawn spacecraft, which was launched in 2007 and travelled to the asteroid belt using ion propulsion engines, visiting the asteroid Vesta before reaching the dwarf planet Ceres in March 2015. NASA's mercury-propellant experiments in the 1970s may well be the origin of the claim that NASA is trying to build space vehicles pushed along by the mercury vortex engines of legend.

A stylised artist’s impression of the Dawn spacecraft with Ceres (created before the mission reached the dwarf planet). Ion propulsion takes a little longer than other methods to manoeuvre into orbit. (image credit: NASA/JPL)

So despite claims that ancient scholars were aware of flying machines that used mercury as a propellant, this is not something NASA is harnessing today. Although ion engines using mercury were tested in the past, they are not in use now. Equally, the ‘ancient’ text explaining the science of aeronautics is really not that ancient at all: it was written less than 100 years ago and doesn't explain how to get into the air. As myths and legends go, the story of the flying peacock is a great one, but in reality there are no mercury vortex engines, and certainly none from the era of King Ravana.
(Article by Martina Redpath, Senior Education Support Officer)
  • The movie The Objective (2008) which deals with US soldiers encountering vimanas in Afghanistan is fiction and not a documentary. It in no way constitutes evidence of ancient technology.

Saturday, July 08, 2017

Wright-Patterson AFB and Alien Technology

Since 1947, the year of the famous Roswell crash, there have been rumors that the US government has stored debris and artifacts from crashed flying saucers, and even bodies of the small, alien crew members of the downed space ships. Much of the evidence of these crash retrievals leads to Dayton, Ohio, and Wright-Patterson's Hangar-18. How much of the legend surrounding the famous Wright-Patterson facility is true?
Are there still alien beings... even possibly live beings, from other worlds at the infamous base in Dayton, Ohio?

History of Wright-Patterson Air Force Base

Originally called Wilbur Wright Field, the government installation was first opened in 1917 to train military personnel during the first World War. Soon, Fairfield Air Depot was created adjacent to Wright Field. In 1924, McCook Field test facility was closed, and the Dayton community purchased 4,500 acres which housed the various facilities. This took in the previously leased land of Wright Field, and the Wright and Fairfield facilities were combined into one. The newly created facility was named after the innovators of flight, the Wright Brothers.
On July 6, 1931, the area east of Huffman Dam, which included Wilbur Wright Field, Fairfield Air Depot, and the Huffman Prairie, was renamed Patterson Field. This was to honor the memory of Lt. Frank Stuart Patterson, who died in 1918 when the plane he was test-flying crashed after its wings separated from the craft. In 1948, the fields were merged under one name, Wright-Patterson AFB.

Testing New Technology at Wright-Patterson AFB

Wright-Patterson is instrumental in testing new weapon technology, along with research and development, education, and many other defense related operations.
It is the home of the Air Force Institute of Technology, supporting the Air Force and the Department of Defense. The USAF's National Air & Space Intelligence Center is also part of Wright-Patterson.

Reverse Engineering Foreign Technology at Wright-Patterson

The base was well known for reverse engineering foreign government aircraft during the Cold War. The base's expertise in the disassembly and recreation of MiG fighters has only enriched the theories that alien craft have been studied there. The workforce is estimated at 22,000, giving us an idea of the massive amount of work being done at the base.

Roswell and Alien Craft Retrievals

Wright-Patterson is most well known for its connection to the Roswell crash, although links can be made to other crash retrievals. Several eyewitness accounts of military personnel and even civilian workers who handled debris from the Roswell crash and saw bodies of creatures not of our world give us a very plausible Wright-Patterson connection to the study of alien technology and physiology.
The same day that the famous Roswell headlines ran in newspapers around the world, there was an enormous amount of activity at the Roswell base. Some debris from the crash, and possibly alien bodies, were sent to Fort Worth, Texas. It is now commonly accepted by researchers that before the Fort Worth flight, another flight to Wright-Patterson had already taken place, carrying debris and alien bodies. This shipment was secretly stored and studied in the infamous Hangar-18.

Were Aliens and Their Technology Stored and Studied in Hangar-18?

UFO researcher Thomas J. Carey, coauthor of "Witness to Roswell," states: "We believe some of the stuff was loaned around, but the main repository was the foreign technology division at Wright-Patterson. We've heard stories over the years of people who say that they're still trying to figure out what that stuff is."
Could this alien material and technology be so advanced, that even after many years of study by our best scientists, they still fall short in understanding the secrets behind it?
If scientists could have unlocked even some of the technological advancements of the ship's inner workings and navigational systems, could it not have been the creative impetus behind the Stealth series of aircraft, and the seemingly rapid advancement of our weapons technology in the last 50-plus years?

Eyewitnesses and Investigators

Much of the eyewitness testimony concerning the secrets of Wright-Patterson comes to us from military personnel, children of eyewitnesses, close friends, and coworkers of those intimately involved with crash debris wreckage and/or alien bodies. Some of these stories have only emerged in the last few years.
A Canadian ufologist related the following account, which he received firsthand from a gentleman whose father served at Roswell. The man's story begins in 1957. He and his father went to see the sci-fi classic "Earth vs. the Flying Saucers." After the movie had ended, they began their journey home. As they drove along, he noticed that his father was uncharacteristically quiet. Finally, the silence was broken when his father said, "They were too big." This was obviously a reference to the aliens depicted in the film.
The man's father then told his long-kept secret. In 1947, he had been stationed at Wright Field as a member of a film unit. One day, he and a fellow worker were summoned by an officer to get their 16mm movie cameras and follow him. The officer led the two men to a heavily guarded airplane hangar, more than likely Hangar-18, although the man's father did not say. Inside the hangar, they were shocked to see a badly damaged, circular spacecraft. Debris from the UFO wreckage was scattered over a large area on a canvas tarp. The officer instructed the two cameramen to film anything and everything in sight, and the two men discharged their duties in due fashion.
Upon finishing this first assignment, they were summoned to the very rear of the hangar and taken inside a refrigeration unit there.
The man's father told his son that he was stunned to see two storage bins which held the bodies of two small alien creatures! The beings were very thin, gray in color, with large eyes, but no eyelids. One of these beings had obviously suffered bodily damage, while the other showed no apparent signs of injury.

Friday, July 07, 2017

Mach Effect

The Woodward effect, also referred to as a Mach effect, is part of a hypothesis proposed by James F. Woodward in 1990.[1] The hypothesis states that transient mass fluctuations arise in any object that absorbs internal energy while undergoing a proper acceleration. Harnessing this effect could generate a reactionless thrust, which Woodward and others claim to have measured in various experiments.[2][3] If proven to exist, the Woodward effect could be used in the design of field-propulsion spacecraft engines that would not have to expel matter to accelerate. Such an engine, called a Mach effect thruster (MET) or a Mach Effect Gravitational Assist (MEGA) drive, would be a breakthrough in space travel.[4][5] So far, no conclusive proof of the existence of this effect has been presented.[6] Experiments by Woodward and others to confirm and utilize this effect continue.[7] The anomalous thrust detected in some RF resonant cavity thruster (EmDrive/Cannae drive) experiments may be explained by the same type of Mach effect proposed by Woodward.[8][9][10]
The Space Studies Institute was selected for NASA's Innovative Advanced Concepts program as a Phase I proposal in April 2017 for Mach effect research.[11][12][13][14]

Mach effects

According to Woodward, at least three Mach effects are theoretically possible: vectored impulse thrust, open curvature of spacetime, and closed curvature of spacetime.[15]
The first effect, the Woodward effect, is the minimal-energy effect of the hypothesis; work on it is focused primarily on proving the hypothesis and providing the basis of a Mach effect impulse thruster. The first of the three general Mach effects for propulsion or transport, it is an impulse effect usable for in-orbit satellite station-keeping, spacecraft reaction control systems, or at best thrust within the solar system. The second and third effects are open and closed spacetime effects. Open curved-spacetime effects could be applied in a field generation system to produce warp fields. Closed curved-spacetime effects would be part of a field generation system to generate wormholes.[citation needed]
The third Mach effect is a closed-curve spacetime effect, or closed timelike curve, called a benign wormhole. Closed-curve space is generally known as a wormhole or black hole. Prompted by Carl Sagan to provide a scientific basis for the wormhole transport in Contact, Kip Thorne[16] developed the theory of benign wormholes. The generation and stability of, and traffic control of transport through, a benign wormhole are only theoretical at present. One difficulty is the requirement for energy levels approximating a "Jupiter-size mass".
Kenneth Nordtvedt showed in 1988 that gravitomagnetism, an effect predicted by general relativity that had not yet been observed at the time and was even challenged within the scientific community, is inevitably a real effect because it is a direct consequence of the gravitational vector potential. He subsequently showed that the gravitomagnetic interaction (not to be confused with the Nordtvedt effect), like inertial frame dragging and the Lense–Thirring precession, is typically a Mach effect.[17]


Mach's principle

The Woodward effect is based on the relativistic effects theoretically derived from Mach's principle on inertia within general relativity, attributed by Albert Einstein to Ernst Mach.[18] Mach's principle is generally defined as "the local inertial frame is completely determined by the dynamic fields in the Universe."[19] The conjecture comes from a thought experiment:[20]
Ernst Mach (1838–1916) was an Austrian physicist […] contemporary of Einstein, to whom he suggested a thought experiment: What if there was only one object in the universe? Mach argued that it could not have a velocity, because according to the theory of relativity, you need at least two objects before you can measure their velocity relative to each other.
Taking this thought experiment a step further, if an object was alone in the universe, and it had no velocity, it could not have a measurable mass, because mass varies with velocity.
Mach concluded that inertial mass only exists because the universe contains multiple objects. When a gyroscope is spinning, it resists being pushed around because it is interacting with the Earth, the stars, and distant galaxies. If those objects didn't exist, the gyroscope would have no inertia.
Einstein was intrigued by this concept, and named it "Mach's principle."

Gravity origin of inertia

A formulation of Mach's principle was first proposed as a vector theory of gravity, modeled on Maxwell's formalism for electrodynamics, by Dennis Sciama in 1953,[21] who then reformulated it in a tensor formalism equivalent to general relativity in 1964.[22]
In this paper, Sciama stated that instantaneous inertial forces in all accelerating objects are produced by a primordial gravity-based inertial radiative field created by distant cosmic matter and propagating both forwards and backwards in time at light speed:
Inertial forces are exerted by matter, not by absolute space. In this form the principle contains two ideas:
  1. Inertial forces have a dynamical rather than a kinematical origin, and so must be derived from a field theory [or possibly an action-at-a-distance theory in the sense of J.A. Wheeler and R.P. Feynman]…
  2. The whole of the inertial field must be due to sources, so that in solving the inertial field equations the boundary conditions must be chosen appropriately.
    — Dennis W. Sciama, in "The Physical Structure of General Relativity", Reviews of Modern Physics (1964).
Sciama's inertial-induction idea has been shown to be correct in Einstein's general relativity for any Friedmann–Robertson–Walker cosmology.[23][24] According to Woodward, the derivation of Mach effects is relativistically invariant, so the conservation laws are satisfied, and no "new physics" is involved besides general relativity.[25]

Gravitational absorber theory

As previously formulated by Sciama, Woodward suggests that the Wheeler–Feynman absorber theory would be the correct way to understand the action of instantaneous inertial forces in Machian terms.[26][27][28]
A first image to understand would be filming a sequence where a rock is thrown in the middle of a pond, making concentric ripples on the water propagating towards the shore.
Running the sequence backwards (thinking it as seeing events running backward in time) we then observe concentric waves propagating from the shore towards the center of the pond, where a rock emerges.
The thing to understand is that advanced waves coming back from the future never propagate farther into the past than the rock hitting the water that initiated all of the waves.
— James F. Woodward, in Making Starships and Stargates, Springer 2013, page 49.[15]
The Wheeler-Feynman absorber theory is an interpretation of electrodynamics that starts from the idea that a solution to the electromagnetic field equations has to be symmetric with respect to time-inversion, as are the field equations themselves.[21][29] Wheeler and Feynman showed that the propagating solutions to classical wave equations can be either retarded (i.e. propagating forward in time) or advanced (propagating backward in time). The absorber theory has been used to explain quantum entanglement and led to the transactional interpretation of quantum mechanics,[30][31][32] as well as to the Hoyle-Narlikar theory of gravity, a Machian version of Einstein's general relativity.[33] Fred Hoyle and Jayant Narlikar originally developed their cosmological model as a quasi-steady-state model of the universe, adding a "Creation field" generating matter out of empty space, a hypothesis contradicted by recent observations.[34] When the C-field is not used and the parts regarding mass creation are ignored, the theory is no longer steady-state and becomes a Machian extension of general relativity. This modern development is known as the Gravitational Absorber Theory.[35]
As the gravitational absorber theory reduces to general relativity in the limit of a smooth fluid model of particle distribution,[36] both theories make the same predictions, except that in the Machian approach a mass-changing effect emerges from the general equation of motion, from which Woodward's transient mass equation can be derived.[37] A resulting force suitable for Mach effect thrusters can then be calculated.[38]
While the Hoyle-Narlikar derivation of the Mach effect transient terms is done from a fully nonlinear, covariant formulation, it has been shown that Woodward's transient mass equation can also be retrieved from linearized general relativity.[39][40]

Transient mass fluctuation

The following has been detailed by Woodward in various peer-reviewed papers throughout the last twenty years.[41][42][43]
According to Woodward, a transient mass fluctuation arises in an object when it absorbs "internal" energy as it is accelerated. Several devices could be built to store internal energy during accelerations. A measurable effect needs to be driven at a high frequency, so macroscopic mechanical systems are out of the question, since the rate at which their internal energy can be modified is too limited. The only systems that can run at a high frequency are electromagnetic energy storage devices. For fast transient effects, batteries are ruled out. A magnetic energy storage device, such as an inductor with a high-permeability core material to transfer the magnetic energy, could be specially built, but capacitors are preferable to inductors: compact devices storing energy at very high energy density without electrical breakdown are readily available, and shielding electrical interference is easier than shielding magnetic interference. Ferroelectric materials can be used to make high-frequency electromechanical actuators, and since they are themselves capacitors, they can serve for both energy storage and acceleration. Finally, capacitors are cheap and available in various configurations. For all these reasons, Mach effect experiments have so far relied on capacitors.
When the dielectric of a capacitor is subjected to varying electric power (charge or discharge), Woodward's hypothesis predicts[43] that a transient mass fluctuation arises according to the transient mass equation (TME):
$$\delta m_0 = \frac{1}{4\pi G}\left[\frac{1}{\rho_0 c^2}\frac{\partial P}{\partial t}-\left(\frac{1}{\rho_0 c^2}\right)^2\frac{P^2}{V}\right]$$
  • $m_0$ is the proper mass of the dielectric,
  • $G$ is the gravitational constant,
  • $c$ is the speed of light in vacuum,
  • $\rho_0$ is the proper density of the dielectric,
  • $V$ is the volume of the dielectric,
  • $P$ is the instantaneous power delivered to the system.
This equation is not the full Woodward equation as seen in the book. There is a third term, $-\left(\frac{\partial}{\partial t}\frac{\phi}{c^2}\right)^2$, which Woodward discounts because his gauge sets $\frac{\phi}{c^2}\approx 1$; the derivatives of this quantity must therefore be negligible.[43]
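To get a feel for the magnitudes involved, here is a minimal numerical sketch of the transient mass equation. The drive parameters (100 W swung sinusoidally at 50 kHz into 1 cm³ of PZT-like dielectric) are illustrative assumptions of mine, not values from any reported experiment:

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

def delta_m0(dP_dt, P, rho0, V):
    """Transient mass fluctuation (kg) per the TME above.

    dP_dt : rate of change of power delivered to the dielectric, W/s
    P     : instantaneous power, W
    rho0  : proper density of the dielectric, kg/m^3
    V     : volume of the dielectric, m^3
    """
    k = 1.0 / (rho0 * c**2)
    return (1.0 / (4.0 * math.pi * G)) * (k * dP_dt - k**2 * P**2 / V)

# Illustrative drive: 100 W sinusoidal at 50 kHz, PZT-like density, 1 cm^3
P, freq = 100.0, 50e3
dP_dt = 2 * math.pi * freq * P   # peak dP/dt for a sinusoidal power swing
print(delta_m0(dP_dt, P, rho0=7500.0, V=1e-6))  # ~5.6e-5 kg
```

The tiny factor $1/\rho_0 c^2$ is offset by the enormous prefactor $1/4\pi G$, which is why the hypothesis predicts fluctuations large enough to chase in a laboratory at all.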

Propellantless propulsion

The previous equation shows that when the dielectric material of a capacitor is cyclically charged and discharged while being accelerated, its mass density fluctuates about its rest value. A device can therefore be made to oscillate, in either a linear or an orbital path, such that its mass density is higher while it moves forward and lower while it moves backward, creating a net acceleration of the device in the forward direction, i.e. a thrust. This effect, applied repeatedly, does not expel any particle and thus would represent a type of apparent propellantless propulsion, which seems to contradict Newton's third law of motion. However, Woodward states that there is no violation of momentum conservation in Mach effects:[41]
If we produce a fluctuating mass in an object, we can, at least in principle, use it to produce a stationary force on the object, thereby producing a propulsive force thereon without having to expel propellant from the object. We simply push on the object when it is more massive, and pull back when it is less massive. The reaction forces during the two parts of the cycle will not be the same due to the mass fluctuation, so a time-averaged net force will be produced. This may seem to be a violation of momentum conservation. But the Lorentz invariance of the theory guarantees that no conservation law is broken. Local momentum conservation is preserved by the flux of momentum in the gravity field that is chiefly exchanged with the distant matter in the universe. [emphasis added]
Two terms are important for propulsion on the right-hand side of the previous equation:
  • The first, linear term, $\frac{1}{\rho_0 c^2}\frac{\partial P}{\partial t}$, is called the impulse engine term because it expresses a mass fluctuation that depends on the derivative of the power and scales linearly with the drive frequency. Past and current Mach effect thruster experiments are designed to demonstrate thrust from, and control of, this type of Mach effect.
  • The second, quadratic term, $-\left(\frac{1}{\rho_0 c^2}\right)^2\frac{P^2}{V}$, is what Woodward calls the wormhole term, because it is always negative. Although this term is normally many orders of magnitude weaker than the first, and thus usually negligible, its effect could in theory become huge in some circumstances. The wormhole term is driven by the impulse engine term, which makes the mass fluctuate about its rest value; when the fluctuations reach a very high amplitude and the mass density is driven very close to zero, the equation shows that the mass should very quickly reach large negative values, with strongly nonlinear behavior. In this regard, the Woodward effect could generate exotic matter, although this remains highly speculative for lack of any experiment demonstrating such an effect.
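The disparity between the two terms is easy to check numerically. A sketch with illustrative drive parameters of my own choosing (100 W at 50 kHz into 1 cm³ of PZT-like dielectric):

```python
import math

c = 2.998e8        # speed of light, m/s

# Illustrative parameters (assumptions, for scale only)
P = 100.0          # instantaneous power, W
freq = 50e3        # drive frequency, Hz
rho0 = 7500.0      # PZT-like dielectric density, kg/m^3
V = 1e-6           # dielectric volume, m^3 (1 cm^3)

k = 1.0 / (rho0 * c**2)
impulse_term = k * (2 * math.pi * freq * P)   # (1/rho0 c^2) * peak dP/dt
wormhole_term = k**2 * P**2 / V               # (1/rho0 c^2)^2 * P^2/V

print(wormhole_term / impulse_term)  # ~5e-19: negligible at this drive level
```

At bench-top power levels the wormhole term is smaller by nearly nineteen orders of magnitude, which is why only extreme mass-density excursions could make it matter.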
Applications of propellantless propulsion include straight-line thrusters or impulse engines, open curved fields for starship warp drives, and even the possibility of closed curved fields such as traversable benign wormholes.[44]

Negative bare mass of the electron

The mass of the electron is positive according to the mass–energy equivalence E = mc², but this invariant mass is made from the bare mass of the electron "clothed" by a virtual photon cloud. According to quantum field theory, because those virtual particles have an energy greater than twice the bare mass of the electron, as required for pair production in renormalization, the nonelectromagnetic bare mass of the "unclothed" electron has to be negative.[45]
Using the ADM formalism, Woodward proposes that the physical interpretation of the "wormhole term" in his transient mass equation could be a way to expose the negative bare mass of the electron, in order to produce large quantities of exotic matter that could be used in a warp drive to propel a spacecraft or generate traversable wormholes.[46]

Space travel

Current spacecraft achieve a change in velocity by the expulsion of propellant, the extraction of momentum from stellar radiation pressure or the stellar wind, or the utilization of a gravity assist ("slingshot") from a planet or moon. These methods are limiting in that rocket propellant has to be accelerated as well and is eventually depleted, while the stellar wind or the gravitational fields of planets can only be exploited locally within the Solar System. In interstellar space, bereft of the above resources, different forms of propulsion, referred to as advanced or exotic, are needed to propel a spacecraft.[47][48]

Impulse engine

If the Woodward effect is confirmed and if an engine can be designed to use applied Mach effects, then a spacecraft may be possible that could maintain a steady acceleration into and through interstellar space without the need to carry along propellants. Woodward presented a paper about the concept at the NASA Breakthrough Propulsion Physics Program Workshop conference in 1997,[49][50] and continued to publish on this subject thereafter.[51][52][53][54]
Even setting aside the impact on interstellar travel, future spacecraft driven by impulse engines based on Mach effects would represent an astounding breakthrough in interplanetary spaceflight alone, enabling the rapid colonization of the entire solar system. With travel times limited only by the specific power of the available power supplies and the acceleration human physiology can endure, crews could reach any moon or planet in our solar system in less than three weeks. For example, a typical one-way trip at an acceleration of 1 g from the Earth to the Moon would last only about 4 hours; to Mars, 2 to 5 days; to the asteroid belt, 5 to 6 days; and to Jupiter, 6 to 7 days.[55]
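These figures are consistent with a constant-acceleration profile: accelerate at 1 g to the midpoint, flip, and decelerate. A quick non-relativistic sanity check (the one-way distances are rough values I have assumed; actual distances vary with orbital positions):

```python
import math

g = 9.81        # 1 g, m/s^2
AU = 1.496e11   # astronomical unit, m

def trip_time_days(distance_m, accel=g):
    """One-way time: accelerate to the midpoint, then decelerate (non-relativistic)."""
    return 2.0 * math.sqrt(distance_m / accel) / 86400.0

for name, d in [("Moon", 3.84e8),
                ("Mars (near opposition)", 0.52 * AU),
                ("Jupiter (near opposition)", 4.2 * AU)]:
    print(f"{name}: {trip_time_days(d):.1f} days")
```

The Moon works out to roughly 0.15 days (about 3.5 hours), Mars to about 2 days, and Jupiter to about 6 days, matching the quoted ranges.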

Warp drives and wormholes

As shown by the transient mass fluctuation equation above, exotic matter could theoretically be created. A large quantity of negative energy density would be the key element needed to create warp drives[56] as well as traversable wormholes.[57] As such, if proven scientifically valid, practically feasible, and scaling as the hypothesis predicts, the Woodward effect could be used not only for interplanetary travel but also for apparent faster-than-light interstellar travel.

Patents and practical devices

Three patents have been issued to Woodward and associates based on how the Woodward effect might be used in practical devices for producing thrust:
  • In 1994, the first patent was granted, titled: "Method for transiently altering the mass of objects to facilitate their transport or change their stationary apparent weights".[59]
  • In 2002, a second patent was granted, titled: "Method And Apparatus For Generating Propulsive Forces Without The Ejection Of Propellant".[60]
  • In 2016, a third patent was granted and assigned to the Space Studies Institute, covering the realistic realizations of Mach effects.[61]
Woodward and his associates have claimed since the 1990s to have successfully measured forces at levels great enough for practical use and also claim to be working on the development of a practical prototype thruster. No practical working devices have yet been publicly demonstrated.[2][3][6][41]
The NIAC contract awarded in 2017 by NASA for the development of Mach effect thrusters is primarily a three-task effort, two experimental and one analytical:[12]
  1. Improvement of the current laboratory-scale devices, in order to provide long duration thrust at levels required for practical propulsion applications.
  2. Design and development of a power supply and electrical systems to provide feedback and control of the input AC voltage, and resonant frequency, that determine the efficiency of the MET.
  3. Improve theoretical thrust predictions and build a reliable model of the device to assist in perfecting the design. Predict the maximum thrust achievable by one device and how large an array of thrusters would be required to send a probe (1.5 m in diameter by 3 m long, with a total mass of 1,245 kg including a modest 400 kg of payload) a distance of 8 light-years.


Test devices

Mach-Lorentz Thruster

Photograph of the 2006 Woodward effect MLT test article.
A former type of Mach effect thruster was the Mach-Lorentz thruster (MLT). It used a charging capacitor embedded in a magnetic field created by a magnetic coil. A Lorentz force, arising from the crossed electric and magnetic fields, appears and acts upon the ions inside the capacitor dielectric. In such electromagnetic experiments, the power can be applied at frequencies of several megahertz, unlike PZT stack actuators, whose frequency is limited to tens of kilohertz. The photograph shows the components of a Woodward effect test article used in a 2006 experiment.[62]
However, a problem with some of these devices was discovered in 2007 by physicist Nembo Buldrini, who called it the Bulk Acceleration Conjecture:
What [Nembo Buldrini] pointed out was that given the way the transient terms of the Mach effect equation are written – in terms of the time-derivatives of the proper energy density – it is easy to lose sight of the requirement in the derivation that the object in which the mass fluctuations occur must be accelerating at the same time. In some of the experimental cases, no provision for such "bulk" acceleration was made.15 As an example, the capacitors affixed to the tines of the tuning fork in the Cramer and the students' experiments made no provision for such an acceleration. Had the tuning fork been separately excited and an electric field applied to the capacitor(s) been properly phased, an effect might have been seen. But to simply apply a voltage to the capacitors and then look for a response in the tuning fork should not have been expected to produce a compelling result.
Other examples could be cited and discussed. Suffice it to say, though, that after Nembo focused attention on the issue of bulk acceleration in the production of Mach effects, the design and execution of experiments changed. The transition to that work, and recent results of experiments presently in progress, are addressed in the next chapter.
15 By "bulk" acceleration we are referring to the fact that the conditions of the derivation include that the object be both accelerated and experience internal energy changes. The acceleration of ions in the material of a capacitor, for example, does not meet this condition. The capacitor as a whole must be accelerated in bulk while it is being polarized.
— James F. Woodward, in Making Starships and Stargates, Springer 2013, page 132.[15]

Mach Effect Thruster or MEGA drive

To address this issue, Woodward started to design and build a new kind of device known as a MET (Mach Effect Thruster) and later a MEGA drive (Mach Effect Gravitational Assist drive), using capacitors and a series of thick PZT disks. This ceramic is piezoelectric, so it can be used as an electromechanical actuator to accelerate an object placed against it: its crystalline structure expands when a certain electrical polarity is applied, then contracts when the opposite field is applied, and the stack of discs vibrates.
In the first tests, Woodward simply used a capacitor between two stacks of PZT disks. The capacitor, while being electrically charged to change its internal energy density, is shuttled back and forth between the PZT actuators. Piezoelectric materials can also generate a measurable voltage potential across their two faces when pressed, so Woodward first used some small portions of PZT material as little accelerometers put on the surface of the stack, to precisely tune the device with the power supply. Then Woodward realized that PZT material and the dielectric of a capacitor were very similar, so he built devices that are made exclusively of PZT disks, without any conventional capacitor, applying different signals to different portions of the cylindrical stack. The available picture taken by his graduate student Tom Mahood in 1999 shows a typical all-PZT stack with different disks:[63]
  • The outer, thicker disks on the left and right are the "shuttlers".
  • The inner stack of thin disks in the center are the shuttled capacitors storing energy during acceleration, where any mass shift would occur.
  • The even thinner disks placed between the shuttlers and on both sides of the inner disk capacitors are the "squeezometers" acting as accelerometers.
During forward acceleration and before the transient mass change in the capacitor decays, the resultant increased momentum is transferred forward to a bulk "reaction mass" through an elastic collision (the brass end cap on the left in the picture). Conversely, the following decrease in the mass density takes place during its backward movement. While operating, the PZT stack is isolated in a Faraday cage and put on a sensitive torsion arm for thrust measurements, inside a vacuum chamber. Throughout the years, a wide variety of different types of devices and experimental setups have been tested. The force measuring setups have ranged from various load cell devices to ballistic pendulums to multiple torsion arm pendulums, in which movement is actually observed. Those setups have been improved against spurious effects by isolating and canceling thermal transfers, vibration and electromagnetic interference, while getting better current feeds and bearings. Null tests were also conducted.[64]
In the future, Woodward plans to scale thrust levels by switching from the current piezoelectric ceramic dielectrics (PZT stacks) to new high-κ dielectric nanocomposite polymers such as PMN, PMN-PT or CCTO. Nevertheless, such materials are new, quite difficult to obtain, and electrostrictive rather than piezoelectric.[65][66]
In 2013, the Space Studies Institute announced the Exotic Propulsion Initiative, a new project privately funded that aims to replicate Woodward's experiments and then, if proven successful, fully develop exotic propulsion.[67] Gary Hudson, president and CEO of SSI, presented the program at the 2014 NASA Institute for Advanced Concepts Symposium.[68]

EmDrive

Another type of claimed propellantless thruster, called the EmDrive by its inventor, British engineer Roger Shawyer, has been proposed to work due to a Mach effect:[8][9]
The asymmetric resonant microwave cavity would act as a capacitor where:
  • surface currents propagate inside the cavity on the conic wall between the two end plates,
  • electromagnetic resonant modes create electric charges on each end plate,
  • a Mach effect is triggered by Lorentz forces from surface currents on the conic wall,
  • a thrust force arises in the RF cavity, due to the variation of the electromagnetic density from evanescent waves inside the skin layer.
When a polymer insert is placed asymmetrically in the cavity, its dielectric properties result in greater asymmetry, while decreasing the cavity Q factor. The cavity's acceleration is a function of all the above factors, and the model can explain the acceleration of the cavity with and without a dielectric.[10]

Experiments

From his initial paper onward, Woodward has claimed that this effect is detectable with modern technology.[1] He and others have performed, and continue to perform, experiments to detect the small forces predicted to be produced by this effect. So far, some groups claim to have detected forces at the levels predicted, others have detected forces much greater than predicted, and still others nothing at all. To date there has been no announcement conclusively confirming the existence of this effect or ruling it out.[6]
  • In 2004, Paul March of Lockheed Martin Space Operations, who started working in this research field in 1998, presented a successful replication of Woodward's previous experiments at STAIF.[75]
  • In 2004, John G. Cramer and coworkers of the University of Washington reported for NASA that they had built an experiment to test Woodward's hypothesis, but that the results were inconclusive because the setup suffered strong electrical interference that would have masked any effect.[76]
  • In 2006, Paul March and Andrew Palfreyman reported experimental results exceeding Woodward's predictions by one to two orders of magnitude. Items used for this experiment are shown in the photograph above.[62]
  • In 2006, Martin Tajmar, Nembo Buldrini, Klaus Marhold and Bernhard Seifert, researchers of the then Austrian Research Centers (now the Austrian Institute of Technology) reported results of a study of the effect using a very sensitive thrust balance. The researchers recommended further tests.[77]
  • In 2010, Ricardo Marini and Eugenio Galian of the IUA (same Argentine institute as Hector Brito's) replicated previous experiments, but their results were negative, and the measured effects were attributed solely to spurious electromagnetic interference.[78]
  • In 2011, Harold "Sonny" White of the NASA Eagleworks laboratory and his team announced that they were rerunning devices from Paul March's 2006 experiment[62] using force sensors with improved sensitivity.[79]
  • In 2014, Nembo Buldrini tested a Woodward device on a thrust balance in a high vacuum at the FOTEC research center in Austria, qualitatively confirming the presence of the effect and reducing the number of possible false positives, though recommending more investigation due to the relatively small magnitude of the effect.[82]


Inertial frames

All inertial frames are in a state of constant, rectilinear motion with respect to one another; they are not accelerating, in the sense that an accelerometer at rest in one would detect zero acceleration. Despite their ubiquitous nature, inertial frames are still not fully understood. That they exist is certain, but what causes them to exist – and whether these sources could constitute reaction media – is still unknown. Marc Millis, of the NASA Breakthrough Propulsion Physics Program, stated: "For example, the notion of thrusting without propellant evokes objections of violating conservation of momentum. This, in turn, suggests that space drive research must address conservation of momentum. From there it is found that many relevant unknowns still linger regarding the source of the inertial frames against which conservation is referenced. Therefore, research should revisit the unfinished physics of inertial frames, but in the context of propulsive interactions."[83] Mach's principle is generally defined within general relativity as "the local inertial frame is completely determined by the dynamic fields in the universe." Rovelli evaluated a number of versions of "Mach's principle" that exist in the literature; some are partially correct, and some have been dismissed as incorrect.[19]

Conservation of momentum

Momentum is defined as mass times velocity. Conservation of momentum is usually illustrated in a two-dimensional plane with a vector diagram, each vector representing a velocity with both direction and magnitude. Determining whether momentum is conserved requires fixing an inertial frame of reference for the observer. Inertial frames are well defined for constant velocity, and conservation of momentum holds in all such frames. During acceleration, or a change in acceleration, conservation of momentum applies in the local inertial frame (LIF) of the instantaneous velocity, not with respect to the proper (a.k.a. coordinate) acceleration measured by the accelerated observer.[citation needed]
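As a minimal numerical illustration of the frame independence described above (the masses and velocities below are made up for illustration, not values from any experiment), the following sketch checks that the total momentum of an isolated two-body system is conserved through a perfectly inelastic collision, and remains conserved after a Galilean boost to another inertial frame:

```python
# Two bodies in one dimension, before a perfectly inelastic collision.
m1, m2 = 2.0, 3.0          # kg (illustrative values)
v1, v2 = 5.0, -1.0         # m/s (illustrative values)
p_before = m1 * v1 + m2 * v2

# After the collision the bodies move together at the common velocity w.
w = p_before / (m1 + m2)
p_after = (m1 + m2) * w
assert abs(p_before - p_after) < 1e-12

# Boost to a frame moving at constant u = 4 m/s: the individual momenta
# change, but momentum is still conserved in the new inertial frame.
u = 4.0
p_before_boosted = m1 * (v1 - u) + m2 * (v2 - u)
p_after_boosted = (m1 + m2) * (w - u)
assert abs(p_before_boosted - p_after_boosted) < 1e-12
```

The same check fails if the "frame" itself accelerates during the collision, which is why the paragraph above restricts the conservation statement to inertial frames.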
A challenge to the mathematical foundations of Woodward's hypothesis was raised in a paper published by Oak Ridge National Laboratory in 2001. In it, John Whealton noted that the experimental results of Oak Ridge scientists could be explained in terms of force contributions due to time-varying thermal expansion, and stated that a laboratory demonstration produced 100 times the Woodward effect without resorting to non-Newtonian explanations.[84] In response, Woodward published a criticism of Whealton's mathematics and understanding of the physics involved, and built an experiment attempting to demonstrate the flaw.[85]
A rate of change of momentum represents a force, whereby F = ma. Whealton et al. use the technical definition F = d(mv)/dt, which expands to F = m dv/dt + (dm/dt)v. The second term involves both the mass fluctuation and the instantaneous velocity, and Whealton argued that it will, in general, cancel the force from the inertial response terms predicted by Woodward. Woodward countered that the (dm/dt)v term does not represent a physical force on the device, because it vanishes in a frame where the device is momentarily stationary.[43]
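The frame dependence at the heart of Woodward's counterargument can be illustrated numerically. The sketch below uses assumed values for the mass fluctuation and drive frequency, not figures from any published experiment; it shows that the convective term (dm/dt)v of F = d(mv)/dt is nonzero in a frame where the device drifts at constant velocity, but identically zero in the momentarily comoving frame:

```python
import numpy as np

# Assumed parameters: mean mass m0 with a small sinusoidal fluctuation,
# m(t) = m0 + dm*sin(w*t), driven at 1 kHz.
m0, dm, w = 1.0, 1e-3, 2 * np.pi * 1e3   # kg, kg, rad/s
t = np.linspace(0.0, 1e-2, 100_001)      # 10 ms, finely sampled

dm_dt = dm * w * np.cos(w * t)           # time derivative of m(t)

# Convective term (dm/dt)*v of F = d(mv)/dt = m*dv/dt + (dm/dt)*v:
peak_lab = float(np.max(np.abs(dm_dt * 10.0)))   # frame with v = 10 m/s
peak_rest = float(np.max(np.abs(dm_dt * 0.0)))   # momentarily comoving frame

print(peak_lab)    # peaks near dm*w*v ~ 62.8 N in the moving frame
print(peak_rest)   # exactly 0.0 in the comoving frame
```

A quantity that can be transformed away by a choice of inertial frame cannot by itself be a physical force on the device, which is the point Woodward raised against Whealton's cancellation argument.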
In an appendix to his thesis, Mahood argues that the unexpectedly small magnitude of the results in his experiments is a confirmation of the cancellation predicted by Whealton; the residual results are instead due to higher-order mass transients that are not exactly cancelled.[70] Mahood would later describe this argument as "one of the very few things I've done in my life that I actually regret".[86]
Although the momentum and energy exchange with distant matter guarantees global conservation of energy and momentum, this field exchange is supplied at no material cost, unlike the case with conventional fuels. For this reason, when the field exchange is ignored, a propellantless thruster behaves locally like a free-energy device. This is immediately apparent from basic Newtonian analysis: if constant power produces constant thrust, then input energy is linear with time while output (kinetic) energy is quadratic with time. There is thus a break-even time (or distance, or velocity) of operation, above which more energy is output than is input; the longer the device is allowed to accelerate, the more pronounced this effect becomes, as simple Newtonian physics predicts.
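The break-even argument can be made concrete with a short calculation. Using illustrative values for power, thrust, and mass (P, F, and m below are assumptions, not figures from the literature): constant acceleration a = F/m gives kinetic energy F²t²/2m, input energy is Pt, and the two cross at t* = 2mP/F².

```python
# Assumed operating point of a hypothetical propellantless thruster.
P = 1_000.0   # W, constant electrical input
F = 0.001     # N, constant thrust
m = 100.0     # kg, craft mass

a = F / m                         # constant acceleration, m/s^2
t_star = 2 * m * P / F**2         # break-even time: P*t = F^2 t^2 / (2m)

kinetic = 0.5 * m * (a * t_star) ** 2   # kinetic energy at t_star
supplied = P * t_star                   # input energy at t_star
assert abs(kinetic - supplied) < 1e-6 * supplied

print(t_star)   # ~2e11 s, i.e. thousands of years at this tiny thrust
```

For the small thrusts reported in Mach effect experiments, the break-even time is astronomically long, but the Newtonian crossover itself is what motivates the conservation discussion above.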
In light of those conservation issues, a Mach effect thruster relies on Mach's principle, hence it is not an electrical-to-kinetic transducer, i.e. it does not convert electric energy to kinetic energy. Rather, a Mach effect thruster is a gravinertial transistor that controls the flow of gravinertial flux in and out of the active mass of the thruster. The primary power into the thruster is contained in the flux of the gravitational field, not the electricity that powers the device. Failing to account for this flux is much the same as failing to account for the wind on a sail.[87] Mach effects are relativistic by nature; for a spaceship accelerating with a Mach effect thruster, the propellant is not accelerating with the ship, so the situation should be treated as an accelerating, and therefore non-inertial, reference frame, in which F does not equal ma. Keith H. Wanser, professor of physics at California State University, Fullerton, published a paper in 2013 concerning the conservation issues of Mach effect thrusters.[88]

Quantum mechanics

In 2009, Harold "Sonny" White of NASA proposed the Quantum Vacuum Fluctuation (QVF) conjecture, a non-relativistic hypothesis based on quantum mechanics for producing momentum fluxes even in empty outer space.[89] Where the Woodward effect draws on Sciama's gravinertial field of Wheeler–Feynman absorber theory, White's conjecture replaces that field with the quantum electrodynamic vacuum field. The local reactive forces are generated and conveyed by momentum fluxes created in the QED vacuum field by the same process used to create momentum fluxes in the gravinertial field. White uses magnetohydrodynamic (MHD) plasma rules to quantify this local momentum interaction, whereas Woodward applies condensed matter physics.[79]
Based on the White conjecture, the proposed theoretical device is called a quantum vacuum plasma thruster (QVPT) or Q-thruster. No experiments have been performed to date. Unlike a Mach effect thruster, which instantaneously exchanges momentum with distant cosmic matter through the advanced/retarded waves (Wheeler–Feynman absorber theory) of the radiative gravinertial field, White's Q-thruster would appear to violate momentum conservation, since the thrust would be produced by pushing off virtual "Q" particle/antiparticle pairs that annihilate after they have been pushed on. However, it would not necessarily violate the law of conservation of energy, as it requires an electric current to function, much like any standard MHD thruster, and cannot produce more kinetic energy than its net energy input.[citation needed]
Woodward and Fearn showed why the amount of electron-positron virtual pairs of the quantum vacuum, used by White as a virtual plasma propellant, cannot account for thrusts in any isolated, closed electromagnetic system such as the QVPT or the EmDrive.[90]

Media reaction

Woodward's claims, in his papers and in space technology conference press releases, of a potential breakthrough technology for spaceflight have generated interest in the popular press[5][20] and university news[91][92] as well as the space news media.[6][93][94][95] Woodward also gave a video interview[96] for the TV show Ancient Aliens, season 6, episode 12.[97] However, doubters do exist.[6]