Friday, September 18, 2015

RSA & Diffie-Hellman prove P does not equal NP

Vinay Deolalikar, a mathematician who works for HP Labs, claims to have proven that P is not equal to NP.
The problem is the greatest unsolved problem in theoretical computer science and is one of seven problems for which the Clay Mathematics Institute has offered million-dollar prizes for correct solutions. The question of whether P equals NP essentially asks whether there exist problems which take a long time to solve but whose solutions can be checked quickly. More formally, a problem is said to be in P if there is a program for a Turing machine, an ideal theoretical computer with unbounded amounts of memory, such that running instances of the problem through the program will always answer the question in polynomial time — time always bounded by some fixed polynomial power of the length of the input. A problem is said to be in NP if it can be solved in polynomial time when, instead of being run on a Turing machine, it is run on a non-deterministic Turing machine, which is like a Turing machine but is able to make copies of itself to try different approaches to the problem simultaneously.

Mathematicians have long believed that P does not equal NP, and the question has many practical implications. Much of modern cryptography, such as the RSA algorithm and the Diffie-Hellman algorithm, rests on certain problems, such as factoring integers, being in NP and not in P. If it turned out that P = NP, these methods would not work, but many now-difficult problems would likely become easy to solve. If P does not equal NP, then many natural, practical problems such as the traveling salesman problem are intrinsically difficult.

In 2000, the Clay Mathematics Institute listed the "Clay Millennium Problems," seven mathematical problems for each of which it would offer a million dollars for a correct solution. One of these problems was whether P equals NP. Another of the seven, the Poincaré conjecture, was solved in 2002 by Grigori Perelman, who first made headlines for solving the problem and then made them again months later for refusing to take the prize money.
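The asymmetry between solving and checking can be made concrete with a small sketch. The function names and the subset-sum instance below are mine, chosen for illustration: finding a subset of numbers that adds up to a target by brute force means trying all 2^n subsets (exponential time), while checking a proposed answer, the "certificate," takes only a quick pass over the list (polynomial time). This is exactly the kind of gap the P vs NP question asks about.

```python
from itertools import combinations

def solve_subset_sum(nums, target):
    """Brute-force search: try every subset (exponential in len(nums))."""
    for r in range(len(nums) + 1):
        for subset in combinations(nums, r):
            if sum(subset) == target:
                return list(subset)
    return None

def verify_subset_sum(nums, target, certificate):
    """Check a proposed solution (polynomial time)."""
    remaining = list(nums)
    for x in certificate:
        if x not in remaining:   # certificate must draw from nums
            return False
        remaining.remove(x)
    return sum(certificate) == target

nums = [3, 34, 4, 12, 5, 2]
cert = solve_subset_sum(nums, 9)        # slow exhaustive search
print(cert)                             # a subset summing to 9
print(verify_subset_sum(nums, 9, cert)) # fast check -> True
```

If P equaled NP, every problem whose answers can be checked this quickly could also be solved this quickly.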
On August 7, mathematician Greg Baker noted on his blog that he had seen a draft of a claimed proof by Deolalikar, although a draft had apparently been circulating among experts for a few days. Deolalikar's proof works by connecting certain ideas in computer science and finite model theory to ideas in statistical mechanics. It aims to show that if certain problems known to be in NP were also in P, then those problems would have impossible statistical properties.

Computer scientists and mathematicians have expressed a variety of opinions about Deolalikar's proof, ranging from guarded optimism to near certainty that the proof is incorrect. Scott Aaronson of the Massachusetts Institute of Technology has expressed his pessimism by stating that he will give $200,000 of his own money to Deolalikar if the proof turns out to be valid. Others have raised specific technical issues with the proof but noted that the attempt presented interesting new techniques that might be relevant to computer science whether or not the proof turns out to be correct. Richard Lipton, a professor of computer science at Georgia Tech, has said that "the author certainly shows awareness of the relevant obstacles and command of literature supporting his arguments." Lipton has listed four central objections to the proof, none of which are necessarily fatal, but which may require more work to address. On August 11, 2010, Lipton reported that the consensus of the reviewers was best summarized by mathematician Terence Tao, who expressed the view that Deolalikar's paper probably did not give a proof that P ≠ NP, even after major changes, unless substantial new ideas are added.
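To see what hangs on the answer, here is a toy version of the Diffie-Hellman key exchange mentioned above, with deliberately tiny, insecure parameters I picked for readability. Both parties publish values and still arrive at the same shared secret; an eavesdropper who sees only the public values must solve a discrete-logarithm-style problem, which is believed hard precisely because no polynomial-time method for it is known.

```python
# Toy Diffie-Hellman key exchange (tiny, insecure demo parameters).
p = 23   # public prime modulus
g = 5    # public generator

a = 6    # Alice's private key (kept secret)
b = 15   # Bob's private key (kept secret)

A = pow(g, a, p)   # Alice publishes A = g^a mod p
B = pow(g, b, p)   # Bob publishes B = g^b mod p

shared_alice = pow(B, a, p)   # Alice computes (g^b)^a mod p
shared_bob = pow(A, b, p)     # Bob computes (g^a)^b mod p

print(shared_alice, shared_bob)   # identical: both equal g^(ab) mod p
```

Real deployments use moduli thousands of bits long; if P equaled NP, no choice of parameters would make this kind of scheme safe.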

Yang–Mills existence and mass gap solved

https://archive.org/download/Yang-MillsProblem/0903.4727v4.pdf

Tuesday, September 15, 2015

I hate GameStop, GameStop really is ignorant

My parents came to my house and said I have 300 games over what I should have. That doesn't bother me. What bothers me is that GameStop lies about console sales. The Internet says the Wii U is dead, and GameStop clerks say the Wii U's sales are increasing and their best month was August. I hope GameStop goes out of business. They lie, their used games are too expensive, and my Internet bandwidth caps prevent me from downloading games over 5 GB online. So of course I walked out without buying any videogame. I love The Videogame Critic.

The video game industry is maturing—fast. The average age of a “gamer,” that is, someone who plays video games on a regular basis, is now 37, according to the Entertainment Software Association, an industry trade group. That’s up from last year, when the average came in at around 35 years old.
Surprised? Don’t be. After all, these “greying gamers” were the first generation to grow up with video games as children. As they’ve aged, many apparently kept on playing, delving even deeper into the gaming abyss through consoles, PCs, and now their mobile devices.
If you care to see this older, dare I say, more “refined” sort of gamer, then make your way out west this week to this year’s E3 Electronic Entertainment Expo in Los Angeles, the gaming industry’s biggest trade show. There you will see plenty of Gen X’ers and Gen Y’ers (is that still a thing?) milling about, many sporting unkempt beards and ironic t-shirts like it’s 2007. Apparently, shaving and adhering to current fashion norms takes way too much time and effort, time away from Assassin’s Creed 32, or some other “new” iteration of a once popular gaming title.
But this older generation of gamers is both a blessing and a curse for the industry. It is a blessing in that as they age, their pockets get deeper, so they potentially have more money to spend on their hobby (assuming they don’t get married and have kids, which, unsurprisingly, many don’t). But it is also a curse, because the industry seems stuck in a time warp.
Simply put, content makers, many of whom are greying gamers themselves, have become lazy. They have failed to innovate on both the hardware and content side of the business, alienating potential young consumers while angering older gamers who crave something newer than just another Call of Duty. Each new game “unveiled” this week in Los Angeles will almost undoubtedly be a mashup of characters and scenes derived from popular movie franchises that debuted in the late 1990s and early 2000s, such as The Matrix, Starship Troopers, The Terminator, Sailor Moon, and The Hobbit, with a dash of The Fast and the Furious thrown in for good measure. It is getting old.
The result of all this nostalgic and creative laziness is a shrinking market. Video game sales in the U.S. actually peaked in 2010, at $17 billion, and have fallen progressively ever since, hitting just $15.4 billion in 2014, according to NPD Group. This is expected to continue unless the industry finds a new life source, and fast.
To be sure, this drop-off isn’t just because younger people don’t watch TV or because they are Snapchatting or whatever all day on their mobile phones. They aren’t interested because the content doesn’t speak to them. While mobile is the “fastest” growing segment of the industry, it is inherently backwards looking given the technological and ergonomic constraints of the market. Either the games have to be terribly simple (and forgettable), like Candy Crush, or they have to be stripped down (worse) versions of console games.
But the industry feels compelled to “capture” this market nonetheless, diverting precious resources that would be better spent advancing the core gaming market. For example, on Sunday evening, Bethesda Game Studios, a major game developer, showed off the newest iteration of its popular “Fallout” series, Fallout 4, but also revealed a new Fallout mobile game, ostensibly to “capture” the yet unnamed generation of kids today. Todd Howard, director of Bethesda Game Studios, said the mobile Fallout version was “inspired by games we love going back 30 years,” and that gamers will see inspirations from older retro games like X-com, SimCity, and FTL, which are “games we really really like.” I think that says it all.
As the show rolls on this week, it would be nice to see some real innovation in the core gaming product, as well as some fresh content aimed at a younger subset of the population. Virtual reality (VR) has been talked about for years, but we have yet to see it come to market. Oculus now says its consumer VR product will be out by the first quarter of next year, and Valve and HTC say their VR console will be ready for this year’s holiday season.
If true, then the industry needs to start building content for this new and potentially “game changing” platform as quickly as possible. And no, that doesn’t mean shoehorning current games to just simply “work” on a VR platform. It means building new games from the ground up, specifically tailored for the virtual reality experience. If the content fails to wow consumers, then they won’t pay hundreds of dollars to acquire a VR system, which means VR will die a quick death, just as it did in the 1990s.
So, given all that, what’s on tap for Monday at E3? Microsoft kicks things off at noon (Eastern Time), followed by EA and Ubisoft later in the day. Sony will have its flashy press conference this evening around 8:30pm ET, while Nintendo will have theirs tomorrow at noon, which is actually the official first day of the conference. Fortune will have people on the ground, so look out for news briefs throughout the day.


Sunday, September 06, 2015

High Energy Liquid Laser Area Defense System field testing

Enemy surface-to-air threats to manned and unmanned aircraft have become increasingly sophisticated, creating a need for rapid and effective response to this growing category of threats. High power lasers can provide a solution to this challenge, as they harness the speed and power of light to counter multiple threats. Laser weapon systems provide additional capability for offensive missions as well—adding precise targeting with low probability of collateral damage.

For consideration as a weapon system on today’s air assets though, these laser weapon systems must be lighter and more compact than the state-of-the-art has produced. The goal of the HELLADS program is to develop a 150 kilowatt (kW) laser weapon system that is ten times smaller and lighter than current lasers of similar power, enabling integration onto tactical aircraft to defend against and defeat ground threats. With a weight goal of less than five kilograms per kilowatt, and volume of three cubic meters for the laser system, HELLADS seeks to enable high-energy lasers to be integrated onto tactical aircraft, significantly increasing engagement ranges compared to ground-based systems.

In May 2015, HELLADS demonstrated sufficient laser power and beam quality to advance to a series of field tests. The achievement of government acceptance for field trials marked the end of the program’s laboratory development phase and the beginning of a new and challenging set of tests against rockets, mortars, vehicles and surrogate surface-to-air missiles at White Sands Missile Range, New Mexico. Ground-based field testing of the HELLADS laser is expected to begin in summer 2015 as an effort jointly funded by DARPA and the Air Force Research Laboratory. Following the field-testing phase, the goal is to make the system available to the military Services for further refinement, testing or transition to operational use.
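The stated goals imply concrete mass and power-density budgets. A quick back-of-the-envelope check, using only the figures given above (the arithmetic is mine, not DARPA's):

```python
# HELLADS design goals as stated: 150 kW output power,
# under 5 kg per kW, and a 3 cubic meter laser system volume.
power_kw = 150
max_kg_per_kw = 5
volume_m3 = 3

max_mass_kg = power_kw * max_kg_per_kw   # mass ceiling for the laser system
power_density = power_kw / volume_m3     # output power per unit volume

print(max_mass_kg)     # 750 kg at most
print(power_density)   # 50 kW per cubic meter
```

A 750 kg, 3 m³ package is what makes integration onto a tactical aircraft plausible, where a ground-based laser of similar power would be far heavier and bulkier.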