Monday, October 30, 2023

Would You Rather Be Governed by Totalitarians or by Fools?

 

Washington, D.C., is the major problem in America today. That city is destroying the country because it is providing absolutely no leadership, whether moral, fiscal, political, or of any other kind true leaders must supply. A brief description of that cesspool follows, and, of course, it has spread its tentacles throughout the nation.

We have two political parties. One is a totalitarian party. The Democratic Party consists entirely of people who crave power, who want to control you, dominate your life, who simply believe that government is the source of all wisdom and authority, and that they are the ultimate possessors of the former and thus should have all of the latter. The Democratic Party is pure evil. There is no such thing as a “good Democrat,” not in political office. That is a contradiction in terms, an oxymoron. People who believe in murdering unborn babies, mutilating children for the pleasure of sexual perverts, who lie incessantly about things such as climate change and COVID vaccines, who allow men to be “women” and vice versa; racists who permit crime to run rampant in their streets and let countless possible terrorists freely into the country to threaten the people they are sworn to protect and serve; degenerates who enact policies that are causing countless millions of Americans to struggle financially and live on the edge of poverty due to higher fuel, food, and housing costs, and who ultimately want to tell everyone what kind of vehicle they can drive or food they can eat—such people are not “good” by any definition of that word except their own warped, subjective standard. And they crave total power to enact the agenda and measures above, forcing you to submit. The only “freedom” Democrats believe in is what they will ALLOW you to have, and, of course, that is not “freedom” at all, as I can tell you from living in China. While its official name is “Democratic Party,” its true name is “Totalitarian Party,” because that, not democracy, is what they believe in.

On the other side of the aisle, there is the goofball Republican Party. We have come to facetiously call that party the “Stupid Party,” and the past week’s activities in Washington, D.C., are a perfect example of why. I wasn’t disappointed that Kevin McCarthy was removed as speaker of the House. But neither am I surprised at the current bungling, in-fighting, spite, and downright stupidity of the Republicans in the House as they flounder in trying to elect another speaker. The Totalitarian Party, in true totalitarian fashion, has voted 100 percent against any candidate (Jim Jordan especially) that the Republicans have tried to elect. But, of course, the Democrats couldn’t stop the Republicans from electing whomever they wanted, if the Republicans would vote in solidarity. But the Republicans don’t do that. In one sense, that is not a bad thing; debate is good and healthy, something the Democrats never allow anymore. But the perception of bumbling stupidity is NOT a good thing, and that is the major perception Republicans are now leaving with the American people.

I don’t believe every Republican in Congress is a fool. I think there are some good, decent, moral people in the Republican Party, even in Washington, D.C., who truly want to do the right thing for the American people, and defend the natural rights of life, liberty, and the pursuit of happiness that our Founders wrote a Constitution to protect. There are some non-fools in the Republican Party. Give me a few minutes and I might even be able to name one or two for you.

But that small group—whoever they are—certainly does not constitute the majority of the Republican Party. The Ronna McDaniels, Mitch McConnells, Lindsey Grahams, Mitt Romneys, Nikki Haleys, and other Swamp creatures of their ilk are far, far too numerous, and they continue to lead the party to defeat after disaster after futile, bungling failure. The Republican Party is simply not good at “government,” and that is not altogether a bad thing, either. We don’t want too much of that stuff. Government is ALL the Totalitarian Party wants, and they intend to put it before you and cram it down your throat. The Republicans don’t even know what is on their plate.

So, this leads me to the question in the title of this column: would you rather be governed by totalitarians or by fools? Sadly, that seems to be our only choice at the moment. Given a choice between those two, I will take fools over murderers any day of the week. Totalitarianism gives us no freedom. Stupidity is not a good thing, but unfortunately, true freedom does give people the right to be ignorant and act foolishly. Such is the cost of freedom—as long as it lasts.

The answer to this conundrum is in proper education. As Thomas Jefferson wrote, “If a nation expects to be ignorant and free, in a state of civilization, it expects what never was and never will be.” Because the American education system is totally in the toilet now, that “state of civilization” Mr. Jefferson spoke of is about to disappear. Barbarism, almost by definition, demands totalitarianism to control it. The Democrats know that, which is exactly why they are doing everything they can to turn barbarians loose in the country.

We can only vote for the people who run for office. And, in recent decades, the Republicans who have been on the ballot have tended to be…fools. They have succumbed to The Swamp. We need to find God-fearing, truly educated, intelligent, decent, moral, and courageous people to run for political office. We need those Americans in the majority in Washington, D.C. and state governments. I don’t know how to find them. But that’s what we need.

Otherwise, we will be governed either by totalitarians or by fools.

Why I started collecting videogames to defeat depression

 

I calmed down a bit and actually made some goals…instead of collecting every game out there…I’m collecting all NES/SNES/N64 games…both CIB (complete in box) and loose with a manual. It’s still a pretty crazy goal but…it’s easier than buying every game for every system.

But to dig into it…it’s a mix of nostalgia and obsession. I need something to focus and obsess on. If I don’t have something…I tend to get unfocused and just…end up in a bad place. I’m struggling with depression…so having something to keep my mind off my dark thoughts is useful. It gives me a goal to keep getting out of bed, keep struggling through the days, just so I can complete my collection and finally be satisfied. Of course I also play the games…gaming helps my depression A LOT. It sucks me into another world…so much better than my own. I can forget my troubles for a few hours a day. Or I can make my own character and live my life how I always wanted. Older games can take me back to my childhood as well…when I was happier…

That’s the focus of my game room…making it into something that resembles my childhood. So I can step into my game room, shut the door behind me, and just get sucked back in time. I’ve done a lot of thinking about this. I’m filling it with toys and stuff from my childhood (not just video game related stuff), the furniture in that room is going to be from the era, the TV is going to be a CRT, the carpet is going to be arcade carpeting (which glows under black light), and I’m going to have glow-in-the-dark stars up on the ceiling…heck, I’m even going so far as to only play 80s music in there via cassettes and an old boom box (since I can’t rely on any local station to play just 80s music).

It’s fun. There’s a thrill when you get a really good deal, or find a really rare game in a $2 bin at a flea market or a thrift store. Your collection is more than just a bunch of games; it’s a collection of finds, each with a story behind it. I think a lot of collectors are really collecting memories: games they loved or really wanted. That’s definitely a big part of it.

There are a lot of collectors who go nuts cleaning their games, but I like the character. I love finding an old receipt, or a handwritten note tucked in a game with cheat codes, combos, and the like, or a sticker from some town’s local video rental store that’s been out of business for a decade. I like to leave those old Kmart stickers or Sharpied names on there. I don’t know, maybe I’m weird with this one.

Also, most people have an attachment to some console. When you collect and have games for every console, it puts a smile on pretty much anyone’s face. People want to play those games, which makes picking up something like Mario Party or Fuzion Frenzy that much more fun.

Lastly, there’s nothing like a big, physical, expensive collection. Personally, I only buy games if I’m getting a deal. Every dollar I put in is at least doubled and grows over time. It feels good looking at a bookshelf filled with money. And it looks cool, too.

There is something about the NES, the system that first got me into video games: I like the look of it, I like the music, I like the memories, I like the games, I like how they look displayed. There was a certain magic to it, and nostalgia plays a huge part. I do have nostalgia for other systems, but nothing like the NES…not when it comes down to it. I played NES from about age 7 to 12, which was basically the core of my childhood; in many ways, it’s the thing I did MOST from about 1987 to 1992. Playing the NES helps me revisit many of those times, but I also just enjoy it. Many of those good times were with people who are no longer living, and it helps me remember them, because the NES was such a huge part of my childhood.

I think collections are better when they are rarer, higher quality, more organized, and built around some type of goal or theme; they look better as a set or as a whole idea. I’d rather have a well-rounded NES collection than a bunch of other stuff, and having less space and a limited budget helps force this.

This is just my thinking though. I look at massive collections of tons of consoles and it doesn’t do anything for me. Do I enjoy them? Sure. But I don’t want to collect for them. I also know my tendencies, and they can get out of control if not kept in check with rules, i.e. too much stuff, too much money spent, etc.

I also collect vintage cameras, vinyl, and books; those collections started in my youth or grew out of things from my family.

The 25 greatest video game consoles – ranked!

 

  25. 3DO (1993)

The second true 32-bit machine after the FM Towns Marty, the 3DO was available via a unique business model: the 3DO Company (formed by Electronic Arts founder Trip Hawkins) licensed its technical specifications to third-party manufacturers such as Sanyo and Panasonic, which then built their own versions. Unfortunately, this approach made the hardware hugely expensive ($699 at launch – equivalent to $1,267 or £990 today) compared with rival consoles that could be sold at a loss by their manufacturers. There were some excellent titles, including the original Need for Speed and the strategy-shooter Return Fire, but the PlayStation killed it stone dead.

  24. Atari Jaguar (1993)

The veteran company’s final console boasted a powerful yet jumbled architecture based around two silicon chipsets (named Tom and Jerry) and a Motorola 68000 processor. Some say it was difficult to code for; it lacked a broad games catalogue, and at $249 it was also very expensive. Now best known for the trio of excellent shooters Tempest 2000, Doom and Alien vs Predator, it remains an intriguing technical oddity.

  23. Sega Master System (1985)

Throughout the early 1980s Sega made several attempts to transfer its arcade expertise to the home console market – the Master System was the most successful. More powerful and with a fuller colour palette than the mighty Nintendo Entertainment System (NES), the eight-bit machine boasted decent arcade conversions, but is best remembered for its scrolling platformers, including Alex Kidd in Miracle World, Wonder Boy, Psycho Fox and an expertly reduced version of Sonic the Hedgehog.

  22. Intellivision (1979)

With its brown and gold chassis, wood effect lining and retro-futuristic controllers, Mattel’s Intellivision screamed “It’s the 1970s!” from every angle. But, developed a year after the release of the Atari VCS, it was a much more sophisticated machine thanks to a 16-bit central processor and generous 16-colour palette. Famous for its pioneering use of licensed sports titles and its convincing arcade ports (Burger Time, Donkey Kong Jr, Bump N Jump …), it also offered intriguing original titles such as the weird operating-theatre sim Microsurgeon and B-17 Bomber, which came with a voice synthesiser for, ahem, “realistic” speech effects.

  21. PC Engine (1987)

In the 1980s most consoles resembled toys – the PC Engine, with its futuristic white chassis and cool mini-cartridges (or HuCards), looked like something out of Akira. Designed by electronics giant NEC and game developer Hudson Soft, the console contained twin 16-bit graphics chips that brought a singular aesthetic quality to arcade conversions such as R-Type, Splatterhouse and Ninja Spirit. Released later in the US as the Turbografx-16, it’s a genuine cult classic.

  20. Sega Saturn (1994)

Hitting Japanese shelves a fortnight before PlayStation, Sega’s 32-bit machine would be forever defined by its failed rivalry with Sony. Its fragmented internal architecture was built around Sega’s cutting-edge arcade machine technology, but developers needed expert knowledge of assembly language to wrestle anything out of it. Still, Sega’s studios triumphed with Virtua Fighter, Nights into Dreams and Sega Rally, while its plethora of stunning 2D shooters and fighting games thrilled hardcore gamers.

  19. Colecovision (1982)

With a Z80 processor three times more powerful than the Atari VCS and a huge 16KB of video ram, the Colecovision was a significant technological leap forward, allowing smooth animation and colourful visuals. Among its 125 games there were interesting original titles such as scrolling adventure Tarzan and Fortune Builder, an early SimCity predecessor. Later expansion modules let owners play Atari game carts and use a steering wheel controller. The machine is best known for excellent arcade conversions including Gorf, Zaxxon and Donkey Kong. Nintendo was so impressed that its head of R&D, Masayuki Uemura, used the Colecovision as inspiration for the NES.
The late Ralph Baer in an image from 2009, with his 1960s prototype of the first games console. Photograph: Jens Wolf/dpa/AP

  18. Magnavox Odyssey (1972)

The first ever games console was developed by engineer Ralph Baer while working for defence contractor Sanders Associates under the attractive code name Brown Box. It consisted of a white and, yes, brown box containing just 40 transistors and 40 diodes, with wires connecting to the TV and two blocky controllers. The games, basically variants of Pong, had no sound and colour was achieved by placing plastic overlays on the screen – but the profound concept of interacting with graphics on your TV began here.

  17. SNK Neo Geo (1990)

Like Sega, SNK was a 1980s arcade giant with a desire to infiltrate the home console market, but its approach was much more ambitious. It set out to build a machine using exactly the same technology as its coin-op hits. This resulted in astonishing home versions of Fatal Fury, Art of Fighting and The King of Fighters; the only catch was the Neo Geo cost three times as much as its competitors and the 330-megabit game carts cost up to $200 each. No wonder it’s known as the Rolls-Royce of game consoles.

  16. Atari VCS/2600 (1977)

For a while, in the late 1970s, Atari was video games. After the success of its home Pong console in 1975, the company’s designers spent a year constructing a microprocessor-based system that could play games based on rom cartridges. During Christmas 1977, 400,000 machines hit US shelves and sold out almost instantly. The look of the machine, with its black fascia and wood panelling, and its simple eight-directional joystick, set the design ethos of the industry, while its games, with their beautifully illustrated boxes, were design classics in both form and function. Infiltrating pop culture via movies as diverse as Airplane! and Blade Runner, the Atari was more than a console – and even its failures, especially the botched film tie-in ET, became the stuff of legend.

  15. Nintendo GameCube (2001)

Beside the muscular design and technical potency of the PS2 and Xbox, with their internet connectivity and DVD support (the Xbox via an add-on), the GameCube looked like a giant Lego brick with a handle and its IBM PowerPC processor was underpowered compared to Microsoft’s machine. But Nintendo wanted a fun, characterful console that was cheap and easy to develop for – that’s what we got. Plus, Metroid Prime, Legend of Zelda: Wind Waker, Resident Evil 4 and Super Mario Sunshine competed with anything on those other machines.

  14. PlayStation 3 (2006)

Beset by difficulties and delays regarding its ambitious Cell processor and the inclusion of a Blu-ray drive, the PS3 was an intimidating project from the start. Its online multiplayer service was generously subscription-free yet inferior to the Xbox 360 offering, and developers found it tough to work with its array of synergistic processing units (SPUs). And yet this is where a new form of lush cinematic gaming experience flourished: Uncharted 2, God of War III, Demon’s Souls, Heavy Rain and Journey all pointed to a future of imaginative, challenging storytelling within rich immersive worlds. Their lessons are still being taught and learned.

  13. Nintendo 64 (1996)

Developed in conjunction with supercomputer specialist Silicon Graphics Inc and originally given the not-at-all hubristic codename Project Reality, the N64 was a contradictory beast – backwardly sticking with carts instead of embracing CD-roms but innovative in its use of an analogue joystick to allow accurate 3D movement. This, of course, led to Super Mario 64, the defining game of the era, but the console saw many other classics, including GoldenEye, Banjo-Kazooie, Wave Race 64 and Legend of Zelda: Ocarina of Time.

  12. Xbox One (2013)

Built more as a nicely boxed PC than a traditional console, the Xbox One boasts multicore AMD processors, HDR (and 4K video) compatibility, cloud storage, game streaming and a host of multimedia options. And just as the machine was beginning to age, its S and X iterations came along and boosted the specs. Alongside beautiful versions of multiplatform hits Witcher 3, Assassin’s Creed Origins and Fallout 4, it has also brought us Forza Horizon 4, Sea of Thieves, Halo 5 and Ori and the Blind Forest. State of the art, in many ways.

  11. Xbox (2001)

Teased by Bill Gates himself at the 2000 Game Developers Conference, Xbox was conceived by the team behind DirectX (Microsoft’s middleware for PC game developers) as a technological Trojan horse to get the company’s products into the living room. From those slightly Orwellian foundations came a robust, powerful and exciting machine, its back catalogue of more than 600 games boasting one of the greatest console first-person shooters ever in Halo, as well as gritty brawler Ninja Gaiden, surreal adventure Psychonauts and Star Wars epic Knights of the Old Republic. From the beginning, Xbox understood the rising importance of online play, with its integrated ethernet port and robust Xbox Live infrastructure. The most significant US-designed console since the Atari VCS.

  10. Nintendo Wii (2006)

Plenty of games industry pundits saw the tech specs for the Wii in 2005 and wrote it off as “two GameCubes taped together”. What they hadn’t figured on was the unique joypad, whose motion controls provided barrier-free access to simple, perfectly designed games such as Wii Sports, Wii Play and Wii Fit. Suddenly, whole families could compete together, from the youngest to the oldest – and Nintendo sold more than 100m units as a result. A victory for utilitarian design over technological obsession.

  9. Sega Dreamcast (1998)

With its built-in modem and hugely innovative controllers (complete with removable memory cards doubling as handheld games machines), the Dreamcast was a visionary piece of hardware, backed up by Sega’s astonishingly creative first-party development teams. The machine saw an array of idiosyncratic titles – Jet Set Radio, Shenmue, Seaman, Rez, Phantasy Star Online – that either invented new genres or utterly revolutionised old ones. But the lack of support from western developers and the sheer might of the PS2 ensured that its life was as brief as it was beautiful.
Nintendo employees and the GameCube in 2001. Photograph: Yoshikazu Tsuno/EPA

  8. Nintendo Switch (2017)

The follow-up to the interesting but flawed Wii U replicated that console’s two-screen approach, but made its built-in HD display truly portable, adding two motion-sensitive controllers into the mix. The Switch is a brilliantly flexible home console, seamlessly switching local multiplayer between the living room and the world at large, somehow combining the genius of the Wii and the Game Boy. Its best games – Legend of Zelda: Breath of the Wild, Animal Crossing: New Horizons, Super Mario Odyssey, Mario Kart 8 Deluxe, Super Mario Maker 2 – exploit the delightful peculiarities of this machine perfectly.

  7. PlayStation (1994)

By 1993, Sony had failed to enter the console industry through collaborations with Sega and Nintendo, so the company’s hardware genius Ken Kutaragi thought, screw it, let’s make our own machine. He designed an architecture that was powerful yet easy to develop for and focused on pushing 3D shapes around the screen as efficiently as possible. Sony then solved its lack of development experience by purchasing UK studio Psygnosis and inking an exclusive deal with Japanese arcade veteran Namco. The resulting console ruled the 1990s, thrilling time-rich twentysomethings with titles such as Tekken, Gran Turismo and Tony Hawk’s Pro Skater. This machine changed everything.

  6. Famicom/Nintendo Entertainment System (1983)

It’s an oft-repeated fact, but always one worth remembering: throughout the 1980s in North America, you didn’t play “video games”, you played Nintendo. Like Hoover and Aspirin before it, the brand was so synonymous with the activity that it became genericised. This was down to the NES, a boxy, dated entry into the console market, which came with funny flat little joypads (heavily inspired by Nintendo’s successful Game & Watch handheld devices) and chunky carts. But the games, oh the games. Super Mario Bros, Legend of Zelda, Contra, Mega Man, Final Fantasy, Excitebike … This was where Nintendo’s (and specifically Shigeru Miyamoto’s) design genius originally flourished and where we learned the company’s maxim that old, well-used technology could be reformulated to realise amazing things.

  5. PlayStation 4 (2013)

Based around similar tech as the Xbox One and launched almost simultaneously, the PS4 saw Sony concentrating on games rather than multimedia functionality, immediately winning the PR war against Microsoft. With its excellent controller, vastly improved online infrastructure and seamless sharing and streaming functions, it’s an innovative system, but where it really wins is in its embarrassment of first- and second-party gaming riches. Uncharted 4, Horizon Zero Dawn, Marvel’s Spider-Man, God of War, Bloodborne, Death Stranding … all of them push at the possibilities of video games as a visual narrative medium.

  4. Sega Mega Drive/Genesis (1988)

By building an architecture capable of accurately converting arcade hits such as Golden Axe, Strider and Altered Beast, and bullishly marketing at teenagers, Sega made Nintendo look fusty and old-fashioned. This punk attitude was amplified further in 1991 by the arrival of Sonic the Hedgehog, a speed-obsessed, spiky-haired dude-bro perfectly in tune with early-1990s MTV culture. The Mega Drive would go on to sell 35m units and host a wide range of experiences from romantic role-playing adventures to real-time military sims. It wasn’t afraid to be weird and loud and rude, and some of us related hard to that. In the process, for better or worse, it invented the whole idea of console gaming as a lifestyle – an identity.

  3. Xbox 360 (2005)

The first console of the broadband era, Xbox 360 put online multiplayer functionality at the core of its offering from the very start. Innovations such as Achievements and the Gamer Score turned the global user base into one vast competitive community, whether the battleground was Call of Duty 4: Modern Warfare, Halo 3 or Forza Horizon. Arguably, this was also the machine that saw the superior versions of epic adventures such as Mass Effect 2, Elder Scrolls V, Bioshock and Red Dead Redemption, ushering in our modern era of mature narrative game design.

  2. PlayStation 2 (2000)

It’s tough to win the console wars two generations in a row. PlayStation 2 didn’t just equal the success of PlayStation – it became the best-selling console of all time, shifting 155m units. Its utter dominance, its technical power and its familiar development environment allowed studios around the world to be extraordinarily creative. This was the golden era that saw mainstream blockbusters Grand Theft Auto, Metal Gear Solid, Gran Turismo, Pro Evo Soccer, Burnout and Ratchet & Clank come to fruition, but it also hosted idiosyncratic treasures such as Katamari Damacy, Ico and Okami, and through the Guitar Hero and Singstar titles it also became the post-pub entertainment platform of choice for a whole generation. This was where TV, movie and music creatives all woke up and realised, ah yes, games are the future, we’d better get in on this. And then everybody did.

  1. Super Famicom/Super Nintendo Entertainment System (1990)

Whole days in front of Street Fighter 2, the living room crowded with mates, coffee table loaded with snacks and Coke cans. All-nighters on Legend of Zelda: a Link to the Past, Super Metroid and Secret of Mana. Sharing Yoshi’s Island and Harvest Moon with your younger sister. Blasting through the Super Star Wars series. Discovering Donkey Kong Country. Millions of us have these memories. The SNES arrived in an industry already changed by the Mega Drive, but Nintendo stuck with what it knew – solid tech and astonishing, fecund creativity. The machine produced beautiful, colourful visuals and lush sampled sounds, and it had the flexibility to allow enhanced cartridges later in its lifecycle. But really, the lasting influence was all down to the games – more than 1,700 of them – and the way they made us feel. That is, in the end, what it’s all about.

Premodern Diversity vs. Civilizational Unity

 

Few Romans in the late decades of their 5th-century A.D. empire celebrated their newfound “diversity” of marauding Goths, Ostrogoths, Visigoths, Huns, and Vandals.

These tribes en masse had crossed the unsecured Rhine and Danube borders to harvest Roman bounty without a care about what had created it.

Their agendas were focused on destroying the civilization they overran rather than peacefully integrating into and perpetuating the Empire.

Ironically, Rome’s prior greatness had been due to the extension of citizenship to diverse people throughout Europe, North Africa, and Asia.

Millions had been assimilated, integrated, and intermarried and often superseded the original Italians of the early Roman Republic. Such fractious diversity had led to unity around the idea of Rome.

New citizens learned to enjoy the advantages of habeas corpus, sophisticated roads, aqueducts, and public architecture, and the security offered by the legions.

The unity of these diverse peoples fused into a single culture that empowered Rome. In contrast, the later disunity of hundreds of thousands of tribal people flooding into and dividing Rome doomed it.

For a multiracial society, the only viable pathway to a stable civilization of racially and ethnically different peoples is a single, shared culture.

Some nations can find collective success as a single homogenous people like Japan or Switzerland.

Or equally, but with more difficulty, nations can prosper with heterogeneous peoples — but only if united by a single, inclusive culture, as the American melting pot once attested.

But a baleful third option — a multicultural society of diverse, unassimilated, and often rival tribes — historically is a prescription for collective suicide.

We are beginning to see just that in America, as it sheds the melting pot, and adopts the salad bowl of unassimilated and warring tribes.

The U.S. is now seeing a rise in violent racially and religiously motivated hate crimes.

The border is nonexistent.

Millions of unlawful immigrants mock their hosts by their brazen illegal entrance.

They will receive little civic education to become Americans. But they will learn that unassimilated tribalism wins them influence and advantages.

In contrast, America was once a rare historical example of a multiracial, but single-culture democracy that actually worked.

Multigenerational Americans were often energized by keeping up with new hard-working immigrants determined to have a shot at success in a free society long denied them at home.

Other large nations have tried such a democratic multiracial experiment — most notably Brazil and India. But both are still plagued by tribal feuding and serial violence.

What once worked for America but is now forgotten is a set of precepts essential for a multiracial constitutional state wedded to generous immigration.

One, America is enriched at its cultural periphery by the food, fashion, art, music, and literature of immigrants.

But it would be destroyed if such diversity extended to its core. No one wants Middle Eastern norms regarding gays or emancipated women.

No one prefers Mexican jurisprudence to our courts.

No one here wants the dictatorship of Venezuela or the totalitarianism of communist China.

Two, people vote with their feet to emigrate to America. They flee their native culture and government to enjoy their antitheses in America.

But remember — no sane immigrant would flee Mexico, Gaza, or Zimbabwe only to wish to implant in their new homes the very culture and norms that drove them out from their old.

If they did that to their new home, it would then become as unattractive to them as what they fled.

Three, tribalism wrecks nations.

Just compare what happened in Rwanda, the former Yugoslavia, or Iraq.

Anytime one ethnic, racial, or religious group refuses to surrender its prime identity in exchange for a shared sense of self, other tribes for their own survival will do the same.

All then rebrand their superficial appearance as essential, not incidental, to who they are.

And just as nuclear proliferation sees other nations go nuclear once a neighboring power gains the bomb, so the tribalism of one group inevitably leads to more tribalism in others. The result is endless Hobbesian strife.

Four, immigration must be measured, so that newcomers can be manageably assimilated and integrated rather than left to form rival tribal cliques.

Five, it must be legal. Otherwise, the idea of citizenship is reduced to mere residency, while the legal applicant is rendered a fool for his adherence to the law.

Six, it must be meritocratic, so immigrants come with English and skills and do not burden their hosts.

And last, it must be diverse. Only that way can all groups abroad have equal access to the American dream.

A diversity of immigrants also ensures that no one particular ethnic or political tribe seeks to use immigration to divide the nation further.

The old immigration once enriched America, but our new version is destroying it.

Can a New Speaker Reinstate an Old Norm?

 

After a little more than three weeks, House Republicans have finally elected a speaker. He’s Mike Johnson, first elected to the House in 2016 from a district in northwest Louisiana. He’s almost unknown to the public, has a right-wing voting record, and has been a supporter of Donald Trump. He grills witnesses effectively but calmly, with no visible anger.

How he does as speaker, whether he moves appropriations bills as promised, his skill negotiating with a Democratic president and Senate — all this is obviously unknown as I write. Unknown as well is whether his election with no dissenting Republican votes could help restore a political norm that has been eroded steadily in recent years.

The speakership is an odd political post, one of just two internal congressional offices mentioned in the Constitution. It was important to the framers, who remembered how in 1642, when King Charles I entered the House of Commons and demanded the arrest of five members, Speaker William Lenthall refused.

“May it please your Majesty,” he said, “I have neither eyes to see, nor tongue to speak in this place, but as this House is pleased to direct me, whose servant I am.” No monarch has entered the Commons chamber in the nearly four centuries since.

In Britain, the speaker operates as a neutral arbiter; the current speaker was a Labour member chosen for his perceived fairness by a Conservative majority. In this country, where political parties, though dreaded and denounced by the founders, came into operation almost immediately, speakers are partisan officials, leaders of their parties with only ceremonial institutional responsibilities.

The longest speakerless intervals in the House came in the middle of the 19th century, as the Whig Party was dying out, the Know Nothing and Republican parties were founded, and the Democratic Party split between South and North.

In 1855, it took the House two months and 133 ballots to elect a Know Nothing speaker (a former Democrat, he later became a Republican), and in 1859 nearly two months and 44 ballots to elect a Whig (who was defeated in his home district in 1860).

The House’s 22 speakerless days this month were brief by comparison, as were the four days and 15 ballots it took to elect McCarthy last January.

The norm of all party members supporting the speaker chosen by secret ballot in conference (the Republican term) or caucus (the Democratic term) grew up in the stabler partisan environment emerging from the Civil War. The speaker, in turn, tended to choose committee chairmen and manage the flow of legislation to and amendments on the floor through control of the Rules Committee.

Speakers’ powers were whittled back in the 1910 revolt against Speaker Joseph Cannon; the progressive rebels, passed over for chairmanships for a dozen years, thought the fair way to allot them was by seniority, a system which by the 1950s weakened speakers and empowered anti-civil rights Southern Democrats.

That was reversed when House Democrats led by Phillip Burton in 1974 and House Republicans led by Newt Gingrich in 1994 instituted the election of committee chairmen by members and party leaders. Before those changes, there had been little point in withholding votes from speakers, and in any case Democrats won at least 243 seats — well above the 218 majority — in every House election from 1958 through 1992.

But in the 15 House elections starting in 1994, only once has a party won more than 243 seats, and with House rules requiring speakers to win absolute majorities of those voting, the potential has grown for violating the norm and voting against a party’s speaker candidate.

Thus in 1998, when Republicans lost rather than gained seats, a few holdouts said they wouldn’t vote again for Newt Gingrich, and after a memorable four-year speakership, he retired.

Among Democrats, violations of the norm became more common as some members sought to set themselves apart from the party’s liberal stands. But with the steely discipline exercised in her record 19 years as Democratic leader, Nancy Pelosi kept the numbers of norm violators down: 15 of 235 Democrats voted against her in 2019; just two of 222 Democrats did so in 2021.

No recent Republican leader had Pelosi’s strengths. In 2013, House Freedom Caucus members, encouraged by freshman Sen. Ted Cruz (R-Texas), imagined they could force non-funding of Obamacare on a Democratic president and Senate. Such shenanigans prompted the resignation of Speaker John Boehner in September 2015 and may have persuaded Speaker Paul Ryan to retire from Congress at age 48 in 2018.

So last January, enough Republicans felt free to vote against the conference’s clear choice, McCarthy, to deny him the speakership for 14 ballots. And in October, backbencher Matt Gaetz, with no discernible non-personal grievance, felt free to move to oust him as speaker, and with seven other Republicans and every Democrat, he did.

Each of the conference’s choices either bowed out or got fewer votes on each floor vote. Members with picayune cavils and even the chairman of the Appropriations Committee freely violated the long-standing norm.

Evidently, this spectacle of disarray prompted second thoughts. Mike Johnson, with a serious but not angry demeanor and with few enemies accumulated in his three terms, won not just a majority in conference but also got assurance no one would vote against him on the floor. No Republican did, for the first time in a dozen years.

So has the norm been restored? And what about the more important norm of accepting the outcome of presidential elections? In 2005, 32 House and Senate Democrats challenged George W. Bush’s victory on frivolous grounds.

In 2016 and 2017, Democrats concocted and pursued the Russia collusion hoax, attempting to delegitimize Donald Trump’s election and drive him from office. And on Jan. 6, 2021, after mobs pillaged the Capitol, 147 Senate and House Republicans voted against accepting states’ electoral votes.

Apologies and confessions of error have been forthcoming from few, if any, hoax perpetrators or election deniers. Any chance they will?

Monday, October 23, 2023

Where will all the laid-off tech workers go?

 

Tech layoffs have become a fact of life over the last year and especially so in the last few months, as tech firms big and small carry out layoffs to reckon with their slowing growth after seeing record profits during the pandemic. What’s less certain is just where these tens of thousands of tech workers will go next.

The good news is that there are still many open jobs for these workers, not only within the tech industry but also, increasingly, outside of it. There’s also increased interest in starting new businesses. And while the layoffs will certainly contribute to some people’s decisions to leave the tech industry or go out on their own, it’s worth looking at the problems with the industry itself, from burnout to bad layoff practices, that are making it a little easier for people to choose a life after tech.

“What drew everybody to Big Tech is because they got crazy with the perks, and it was so sexy — and everybody got so intrigued by that,” said Kate Duchene, CEO of professional staffing firm RGP. “The downside is you’re laid off with an email at 3 am. Or the reason you found out you’re laid off is because your badge doesn’t work anymore.”

This is all part of a cultural about-face happening at big tech companies, which for years gobbled up high-skilled workers by wooing them with big paychecks and lavish perks. Now these companies are preaching austerity and asking their giant workforces to act like startups again. At the same time, tech giants have gone from being exciting places to work to not much different from the rest of corporate America, leading some to question just what they saw in the industry in the first place.

Though they’ve made a lot of headlines, the recent layoffs seem more like a course correction than a bubble bursting. That doesn’t mean it’s not painful. Already this year, 78,000 tech industry workers have lost their jobs, following 160,000 last year, according to Layoffs.fyi. But while the layoffs are hugely destructive to those involved, their numbers aren’t yet enough to put a real dent in the massive tech job market.

As a whole, the US tech industry, which includes companies like Google and Apple, added employees for the 25th consecutive month in December, according to data from industry association CompTIA. The number of people working in tech occupations — the association defines these as computer-related technical roles, like software developer, network engineer, data analyst — was at a record high of about 6.5 million that month, and their unemployment rate was near a record low of 1.8 percent, compared to 3.5 percent for all jobs. It’s certainly possible those numbers shifted in January, but tens of thousands of layoffs won’t move the needle much in an industry of millions.

Most people with tech occupations — 59 percent — don’t actually work in the tech industry, according to CompTIA. That figure has remained remarkably stable for the last decade. That’s because even as finance, health care, and retail companies started requiring more tech talent to help them digitize and automate their businesses, the tech industry — especially software development — grew, too. But the balance could tilt even further to non-tech industry companies in the months and years to come.

“Heading into 2023, if we see some of these shifts that are occurring right now, it would not surprise me to see that we do see a larger representation [of tech workers] outside of tech,” Tim Herbert, chief research officer at CompTIA, told Recode. He added that he doesn’t expect some huge exodus of workers from the tech industry, but given the size of tech employment, even a 1 percentage point change would be notable.

It’s important to remember that the tech industry employs all kinds of workers. While we don’t have a breakdown of what types of jobs tech companies have been getting rid of, it’s safe to say many of them are jobs that don’t require a computer science degree, like human resources or sales. For example, while Google’s layoffs in California certainly hit people in engineering roles, they also included nearly 30 in-house massage therapists. For the employees who were laid off in recent weeks, their decision to find a new tech job, leave the tech industry, or start their own business might depend on what exactly they did in tech.

Workers with in-demand tech skill sets, namely engineers, will likely have the easiest time finding more work, wherever they decide to go. There were about 300,000 job postings for tech professionals in December, lower than their peak but roughly consistent with the past four years, according to a December 2022 report from tech hiring platform Dice. The biggest and fastest-growing industries for tech professionals are finance, manufacturing, and health care. Meanwhile, the list of biggest employers of tech talent includes big tech companies like Google and Amazon alongside corporate giants like Wells Fargo, General Motors, and Anthem Blue Cross.

“Given the scope of the downsizing in tech and the well-publicized reasons those decisions were made, we are likely to see many tech professionals think twice about taking their next role at either a tech giant or startup,” Nick Kolakowski, senior editor at Dice, told Recode.

Michael Skaff made the decision to leave the tech industry well before the current layoffs. He spent the first half of his 30-year career in a variety of IT jobs within the tech sector and the second half outside of it. He’s currently in the top tech role, CIO, at Jewish Senior Living Group, a health care management company. While he admits that the rate of technological change is much slower outside tech, he doesn’t think tech’s ethos of “move fast and break things” would be suitable in industries like health care, despite its need for technological change.

“There are ways to change within the existing flows of operations that allow for progress without disrupting or breaking something,” Skaff said. “You don’t want to break health care.”

For companies outside tech that couldn’t offer such high salaries or didn’t have the cultural draw of the Googles of the world, the present moment is a chance to hire the tech workers they’ve long wanted, if they can make themselves attractive enough. Those new hires still won’t come cheap, though. While compensation is still the most important thing driving tech workers to a job — it has been this way forever — the No. 2 item on that list is a newer addition, according to a recent Gartner survey shared with Recode: work-life harmonization.

“It certainly presents an opportunity for traditional employers — banks, retailers, health care companies — to tap into and maybe win back some of the employees that left them,” said Graham Waller, a VP analyst at Gartner Research.

These layoffs also present an opportunity for workers to strike out on their own. Applications to form startups last year were the second highest they’ve ever been, and tech workers are adding to that trend.

To Joe Cardillo, starting their own business was a way to make work better for themself and others. Cardillo, who had been managing marketing teams at tech startups and was over the “grind culture,” started their own management coaching firm, The Early Manager, after going through a series of “very stressful” layoffs since the start of the pandemic. So Cardillo took what they felt they did well at their former jobs, managing and teaching others to manage, and combined it with their ideas about how to build a good workplace, like giving employees more say in the conditions of their labor.

“I’m very interested in the idea of democracy at work,” Cardillo said.

That certainly feels like a far cry from the seeming brutality of recent tech layoffs, which have left many with hard feelings. Whether people will actually get better conditions or kinder treatment elsewhere remains to be seen.

We won’t know for years exactly where the workers affected by recent tech layoffs will end up. It’s possible that this is only a brief aberration in what’s otherwise a growing tech sector, or that people will eschew Big Tech to found startups that prove to be the next big thing — what many say happens during financial downturns but what might be more myth than truth. Or perhaps, every company truly is a tech company, and these layoffs put the rest of corporate America on more equal footing with tech.

David Jacobowitz had been working for tech companies pretty much his entire career, most recently in sales and marketing at TikTok, when he decided to voluntarily leave to pursue his passion: his own sugar-free chocolate business called Nebula Snacks. He’d been through his share of layoffs and knew that “loyalty is not necessarily rewarded.”

Beyond that, though, he realized that perhaps the tech industry just wasn’t for him.

“I looked at the trajectory and the lifestyle that I would have to live for the next 10 to 15 years if I wanted to climb the corporate ladder within tech and, when I really got down to it, I kind of answered the question: I don’t want to do that.”

Smart > Dumb

 

The sad truth is that some people in our party are stupid. Statistically, that is inevitable since the parties are about evenly split. When you have 50% of the population, you’re going to have a certain percentage of the idiots. Well, the Republican Party is undoubtedly statistically within the band of normal, because we sure have our share of nitwits. And I’m tired of them.

Look at the House of Representatives. For a number of reasons, including stupid choices by our voters and dumb actions by party leaders and politicians, we have a mere five-seat majority there. We should’ve had a huge majority, considering the historical trends and the political climate in 2022. Still, being the Stupid Party, the Republicans were stupid and only got five seats above even. It is what it is. However, some people aren’t willing to accept that reality.

But some people in our party are too dumb to understand that it is what it is. They think it is what it isn’t. They think that because they believe really, really hard in their version of conservatism, their version of conservatism should prevail, even though it is not what the majority of the House believes. Keep in mind I share their beliefs. Their version of conservatism is my version of conservatism. I would love our version of conservatism to be ascendant. I’ve worked for that for nearly 40 years. But right now, it’s not going to be, in large part because of our own party’s failures. Some in our party don’t seem to understand that. They’re very mad that Democrats plus RINOs outnumber conservatives in the House and can outvote us. As a result, we conservatives cannot simply impose our will. Apparently, they didn’t watch Schoolhouse Rock. Apparently, they didn’t learn how to count. Apparently, they think that because they’re really, really, really upset, their upsetness matters more than reality.

The reality is that we somehow managed to get a Speaker last January and that Speaker was imperfect. Kevin McCarthy was not the guy I wanted for the job in the abstract. He was not doing everything I wanted him to do. But you know what he wasn’t doing? He wasn’t being a Democrat. He was being an obstacle to Biden’s unilateral control over two branches of the government. And when you fail to win elections – I can’t even deal with people who are going to insist that we really won even though we didn’t, which I know because of who’s in office and who isn’t – you don’t get everything you want. 

This is why I prefer to win elections. Of course, winning elections requires that you do things that are consistent with winning elections. One of those is to not look like a bunch of clowns wrestling in Jell-O. The GOP can’t elect a Speaker because a tiny minority of backbench tools, for a variety of stupid reasons, is working with the Democrats to ensure we can’t elect a new imperfect but adequate Speaker. How did this happen? Because a tiny minority led by Florida Beavis worked with the Democrats and tossed out our imperfect but adequate, Trump-endorsed Speaker. Why did they do that? Because of reasons and shut up and neocon and RINO and blah blah blah blah blah. Oh yeah, and “uniparty,” too.

You can tell dumb people by their promiscuous use of terms like “neocon,” “uniparty,” and “RINO.” Words actually have meaning. They’re more than an avatar that represents something that you dislike. “Neocon” has a specific meaning. It’s not just somebody who thinks we have enemies overseas that must be confronted and defeated. Now it seems to be anyone who doesn’t want to withdraw the American armed forces within Kansas's borders. If you find yourself calling somebody who supports Israel over the seventh-century savages a “neocon,” you are choosing to be dumb.

How about “Uniparty”? That’s a cool, edgy term that stupid people use because they stole it from smart people who recognized the commonalities between Republicans and Democrats and how that conglomeration of creeps in Washington tended to work together across party lines to frustrate what the people actually wanted. It does not mean someone who wants a Republican body actually to function. It does not mean that you think that the Republican majority should have a Speaker, nor does it mean someone who thinks that the Republican Party probably shouldn’t throw out a Speaker unless it has another viable Speaker waiting in the wings who can get 218 votes. I know you don’t like Kevin McCarthy. I don’t like Kevin McCarthy. But I like having a Speaker because, among other things, it demonstrates to the voters that maybe they should trust Republicans with control of the House. And having a Republican Speaker who can control things in the House is the closest thing that we have at the moment to an obstacle to the actual uniparty. 

“BUT THEY CHANGED THINGS!”

Yes, for the worse. Getting rid of Kevin McCarthy means that, assuming no miracle happens between the time I screamed/typed this and the time you read it and the GOP gets its Schiff together, we’re inevitably going to have a bunch of soft Republicans from Biden districts hooking up with Hakeem Jeffries and installing a Democrat or Democrat-friendly Republican as Speaker, and that’s going to be a giant, pointy suppository for conservatism. Do you want to fight the uniparty? Maybe don’t help the Democrats and the RINOs work together to screw conservatives.

And that’s another abused noun, “RINO.” “RINO” has a specific meaning too. That meaning is not any Republican you just dislike this week. A RINO is a Republican in name only, a person with a Republican label who often runs in a purple district and would be a Democrat if not for the necessity of being a Republican to get elected. They tend to be liberal Republicans. But they are Republicans, and yes, they frustrate the hell out of me. But you know what? Our caucus includes about 15-20% RINOs. It just does. It does because the voters in their districts elected them. I’ve got no problem primarying them with candidates who can beat them – not lunatics who will lose big in a GOP-winnable district because they are convinced that the Kraken broke into the Dominion machines using psychic mind-control lasers. 

Maybe I’m expecting a little too much because I’ve been a Republican for about 40 years, and I kinda know what I’m talking about. You don’t have to agree with me or me with you. Still, we should expect a certain baseline of experience and understanding of how things are supposed to work, how things actually do work, what the Constitution says, how a bill becomes a law, and so forth, from somebody weighing in. I just can’t stand the nitwits who never thought about politics until last week, when they suddenly got excited and jumped in and then glommed onto all these terms that actually mean something and started spewing them out in a spray of hack clichés.


These newcomers tend to be emotion-driven. Those who actually want to win rather than express their innermost feelings, like a bunch of teenage girls at a slumber party, tend to be objective. We look at the situation, evaluate how it really is, and take the course of action most likely to benefit us. Sometimes, that’s not much benefit at all because, as adults understand, sometimes you win, and sometimes you lose. I prefer winning. You know, actual winning is where my guy takes office and is able to exercise power. I reject moral victories, where you lose gloriously and then whine about how the uniparty neocon RINOs are getting in the way of your faction exercising unrestrained power.

Grow the hell up.

The GOP Is at War With Its Base

 

For many years, the leadership of the Republican Party has been at war with the grassroots activists who work tirelessly on its behalf. This division has been apparent ever since Donald Trump descended the escalator in 2015 and announced his presidential campaign. 

The party establishment worked overtime to deny the nomination to Trump, yet the Republican voters overwhelmingly supported him. The resentment toward Trump and his supporters continued during his presidency and still exists today. 

Sadly, there is little chance this divide will be bridged any time soon. The latest example occurred last week as congressional Republicans were attempting to elect a House Speaker. 

The overwhelming favorite of the party’s base was U.S. Representative Jim Jordan (R-OH). He was the co-founder of the House Freedom Caucus and has served admirably as the House Judiciary Committee Chairman. His history as a stalwart conservative earned the trust of grassroots Republicans. 

Unfortunately, a contingent of moderate Republicans were steadfast in opposing Jordan. Despite their office phone lines burning up with calls from Republicans throughout the country, these obstinate opponents of Jordan refused to budge. By the third and final vote, 25 House Republicans opposed Jordan. 

This group includes the type of establishment Republicans who are steadfastly opposed to the Make America Great Again (MAGA) movement. According to Christopher Bedford, Executive Editor of CommonSenseSociety.org, these Republicans “would rather the speakership remain vacant than elect Jordan; they would rather break Congress than elevate someone they're not confident they can control. They believe money for Ukraine and other business-as-usual items are worth permanent institutional damage.”

Jordan would have been a different type of House Speaker, listening to the grassroots instead of the special interest groups. He would have been hesitant to add to the $113 billion already sent by Congress to Ukraine. 

Jordan would have also ended the practice of passing continuing resolutions, which has been the standard operating procedure for 27 years. He would have insisted on Congress passing twelve separate appropriations bills, which former House Speaker Kevin McCarthy (R-CA) failed to do. 

In January, McCarthy was finally elected House Speaker after 15 ballots because he promised the members of the Freedom Caucus he would pass separate appropriations bills. If McCarthy had kept his promises, he would still be House Speaker today. 

The Republican Party establishment was extremely comfortable with McCarthy but did not want to take a chance on Jordan, exposing the division in the Republican Party. In contrast, the Democrats are united. For example, in the latest House Speaker vote, every Democrat voted for their nominee, Congressman Hakeem Jeffries (D-NY). 

As late as the Clinton administration, there were a sizable number of moderates in the Democratic Party. Today there are very few moderates, and none wield any power. All the top leaders in the Democratic Party are far-left progressives who espouse socialist policies. 

The most powerful Democrat is U.S. President Joe Biden, who is overseeing the most progressive administration in American history. His policies are more left-wing than those advocated by the last Democrat President, Barack Obama. Thus, the Democratic Party, in the last twenty years, has moved to the extreme left ideologically. 

Unlike far-left Democrats, most Republican Party voters are Trump conservatives who believe in the MAGA agenda. Unfortunately, too many of the party’s leadership reject those policies and are more comfortable promoting a moderate political agenda. 

This agenda was embraced by both Bush presidents; by the late U.S. Senator John McCain, the 2008 Republican Party presidential nominee; and by U.S. Senator Mitt Romney (R-UT), the 2012 GOP presidential nominee. This agenda includes support for open borders, generous government programs, globalism, a neocon foreign policy, and the aggressive extension of American military power. 

In addition, U.S. Senate Minority Leader Mitch McConnell (R-KY) holds those beliefs. McConnell, referred to by President Trump as the “Old Crow,” hates the MAGA movement. He does not believe in putting “America First” and addressing the problems of his countrymen. 

For McConnell and his fellow Republican neocons, the plight of average Americans is of no concern. Their top interest is in increasing the debt by boosting the military budget and making sure billions of dollars are spent on the war in Ukraine. 

In fact, just a few months ago, McConnell said that the war in Ukraine was “the number one priority for the United States right now according to most Republicans. That’s how we see the challenges confronting the country.”

Maybe McConnell and his fellow neocons believe that the war in Ukraine is the top priority, but most Republicans are concerned about the weaponization of the Department of Justice, the censorship of conservative ideas, the poor economy, the high crime rate, the influx of illegal drugs and the open southern border among other issues. 

In a recent Pew Research poll, the major concerns for Republicans were inflation, the budget deficit and illegal immigration. The war in Ukraine was not among the top 16 issues identified by Republicans. 

As Donald Trump Jr. tweeted, there is a disconnect between “actual Republicans and DC swamp rats.” The leadership of congressional Republicans and at the Republican National Committee (RNC) are “swamp rats.” 

They should listen to the grassroots of the party, for their priorities are right on target. Major alarm bells are sounding for each of the top three concerns of Republicans. 

Inflation is at 3.7%, increasing for three months in a row, and is almost triple the 1.4% rate at the end of the Trump presidency. In the last fiscal year, the budget deficit increased a massive $1.7 trillion, an increase of 23% from 2022.  

Of course, illegal immigration is a tremendous problem. In the last fiscal year, “a total of 3.2 million non-U.S. citizens attempted to enter the country illegally or be paroled into the country at different ports of entry.”

Loyal Republicans are screaming at party leaders to address real problems, but the “swamp rats” have other priorities. At some point, this disconnect will lead to the party’s dissolution.

The American Mainstream Press is Sinister and Deadly

 

The events of the past week and a half confirm what many on the right have long known. The American mainstream media is sinister and deadly. This has been going on for at least a decade and, quite likely, far longer.


The latest episode of the mainstream media's utter irresponsibility is publishing 'news' that Israeli Defense Force rockets fired on a hospital in Gaza filled with Palestinian doctors, nurses, administrative staff, patients, and visitors. 

This claim has been so thoroughly disproven that those who published this utter and despicable lie ought to retire from the profession if, indeed, you can still call it a profession.

Film footage of rockets being fired by Hamas into Israeli territory shows that one of the rockets went off course and struck near the hospital where the alleged massacre took place. The rocket did not hit the hospital itself, contrary to what the New York Times and outlets on down reported. 

The rocket hit a parking lot near the hospital, and many cars and other vehicles were damaged beyond repair. The damage to the hospital itself was minimal. Some people were hurt as a result of Hamas striking its own people.

Why Wait for the Facts?

Instead of waiting for the facts, which a responsible media would have done, the American mainstream media went full blast, accusing Israel of this atrocity. What was their source? Hamas leaders themselves. 

These leaders claimed that Israel had fired on the hospital and had killed 250, then revised that number to 500, in an attempted genocide. Any reporter worth his or her salt would have investigated further. Any responsible publication would have ensured they had the facts before proceeding. But no, that's too much to ask of the American mainstream media.

Instead, they published this blatant lie and broadcast it around the world. They inflamed the Middle East in ways that actual war rarely does. In one Arab country after another, in the principal cities, there were demonstrations -- not just thousands but tens of thousands marching and chanting against Israel for being monsters, genocidal, and wishing to eradicate the Palestinians by any means possible.

Not Nearly the First Time

If only this were the first time that the American mainstream media acted in sinister and deadly ways. Unfortunately, this has become de rigueur with them. If you recall, more than a decade ago, George Zimmerman, a Hispanic man, was attacked in an apartment courtyard by Trayvon Martin, who was bashing Zimmerman's head into the sidewalk. Zimmerman was finally able to reach for his gun and put an end to the attack by killing Martin. 

What did the American mainstream press report? That Zimmerman was a white man who stalked Trayvon Martin and shot him in cold blood. As such, the press fanned the flames of racial hatred in America, which ended up costing some people their lives at other times and in other places.

To this day, there are many on the Left who still believe this narrative, even though the facts of the case thoroughly prove that it was, indeed, Trayvon Martin who was the aggressor and Zimmerman who was fighting for his life.

A Career Criminal at 18

Not long afterward, Michael Brown in Ferguson, Missouri, allegedly said, "Hands up, don't shoot," except that he never did. He tried to wrestle the gun away from a police officer while that officer was still seated in his own squad car.

The press ran with the story that the police officer fired on Michael Brown, who was completely innocent. Tellingly, video footage from a convenience store surfaced showing that, just 10 minutes before, Brown had choked a clerk much smaller than himself while stealing from the store.

Brown was a hardened criminal in the body of a youth who pranced around town as if he owned it. The press never reported that. As with Trayvon Martin and so many others, when they published a picture of Brown, they showed him in a high school graduation cap and gown. 

Hey, we all looked pretty good in high school, and innocent in our 10- and 12-year-old pictures. That the press continues to use these angelic poses, as opposed to how these perpetrators actually looked at the time, speaks to their unending commitment to sinister reporting.

To emphasize, when the press fans the flames of racism and racial hatred, people die. People die in riots, street skirmishes, fires, and random shootings, and via homemade bombs.

With Blood on Their Hands, Knowingly

The current sinister and deadly lie about Hamas rockets that the American mainstream media has presented to the world is going to kill thousands of people in the coming weeks, months, and years.

The truth will not emerge in those countries, and most of those people who are in the streets protesting against Israel and against Jews and against America will not know that the story was misreported. 

Many street demonstrators live in relatively closed societies where they do not get the full story if they get any accurate news. The lie spreads and spreads and becomes part of the prevailing lore that propels countries and peoples to hate Israel and hate America.

No Retraction

The American mainstream press has since found out that what they reported was completely inaccurate. The retractions, if any, will appear on page 16 in the fourth column way down on the page. There is no remorse on the part of Leftist newspaper publishers. They could not care less that they framed, condemned, and misrepresented Israel. They are happy to do so and, given the chance, will do so again and again. 

They are sinister and deadly, and there's no other way to put it.

Would You Rather Be Governed by Totalitarians or by Fools?

 

Washington, D.C., is the major problem in America today. That city is destroying the country because it is providing absolutely no leadership morally, fiscally, politically, or in any other measure true leaders need. A brief description of that cesspool follows, and, of course, it has spread its tentacles throughout the nation.

We have two political parties. One is a totalitarian party. The Democratic Party consists entirely of people who crave power, who want to control you, dominate your life, who simply believe that government is the source of all wisdom and authority, and that they are the ultimate possessors of the former and thus should have all of the latter. The Democratic Party is pure evil. There is no such thing as a “good Democrat,” not in political office. That is a contradiction in terms, an oxymoron. People who believe in murdering unborn babies, mutilating children for the pleasure of sexual perverts, who lie incessantly about things such as climate change and COVID vaccines, who allow men to be “women” and vice versa; racists who permit crime to run rampant in their streets and let countless possible terrorists freely into the country to threaten the people they are sworn to protect and serve; degenerates who enact policies that are causing countless millions of Americans to struggle financially and live on the edge of poverty due to higher fuel, food, and housing costs, and who ultimately want to tell everyone what kind of vehicle they can drive or food they can eat—such people are not “good” by any definition of that word except their own warped, subjective standard. And they crave total power to enact the agenda and measures above, forcing you to submit. The only “freedom” Democrats believe in is what they will ALLOW you to have, and, of course, that is not “freedom” at all, as I can tell you from living in China. While its official name is “Democratic Party,” its true name is “Totalitarian Party,” because that, not democracy, is what they believe in.

On the other side of the aisle, there is the goofball Republican Party. We have come to facetiously call that party the “Stupid Party,” and last week’s activities in Washington, D.C., are a perfect example of why. I wasn’t disappointed that Kevin McCarthy was removed as speaker of the House. But neither am I surprised at the current bungling, in-fighting, spite, and downright—stupidity—of the Republicans in the House as they flounder in trying to elect another speaker. The Totalitarian Party, in true totalitarian fashion, has voted 100 percent against any candidate (Jim Jordan especially) that the Republicans have tried to elect. But, of course, the Democrats couldn’t stop the Republicans from electing whomever they wanted, if the Republicans would vote in solidarity. But the Republicans don’t do that. In one sense, that is not a bad thing; debate is good and healthy, something the Democrats never allow anymore. But the perception of bumbling stupidity is NOT a good thing, and that is the major perception Republicans are now leaving with the American people.

I don’t believe every Republican in Congress is a fool. I think there are some good, decent, moral people in the Republican Party, even in Washington, D.C., who truly want to do the right thing for the American people, and defend the natural rights of life, liberty, and the pursuit of happiness that our Founders wrote a Constitution to protect. There are some non-fools in the Republican Party. Give me a few minutes and I might even be able to name one or two for you.

But that small group—whoever they are—certainly does not constitute the majority of the Republican Party. The Ronna McDaniels, Mitch McConnells, Lindsey Grahams, Mitt Romneys, Nikki Haleys, and other such Swamp Creatures Of Their Ilk are far, far too numerous, and continue to lead the party to defeat after disaster after futile, bungling failure. The Republican Party is simply not good at “government,” and that is not altogether a bad thing, either. We don’t want too much of that stuff. Government is ALL the Totalitarian Party wants, and they intend to put it before you and cram it down your throat. The Republicans don’t even know what is on their plate.

So, this leads me to the question in the title of this column: would you rather be governed by totalitarians or by fools? Sadly, that seems to be our only choice at the moment. Given a choice between those two, I will take fools over murderers any day of the week. Totalitarianism gives us no freedom. Stupidity is not a good thing, but unfortunately, true freedom does give people the right to be ignorant and act foolishly. Such is the cost of freedom—as long as it lasts.

The answer to this conundrum is in proper education. As Thomas Jefferson wrote, “If a nation expects to be ignorant and free, in a state of civilization, it expects what never was and never will be.” Because the American education system is totally in the toilet now, that “state of civilization” Mr. Jefferson spoke of is about to disappear. Barbarism, almost by definition, demands totalitarianism to control it. The Democrats know that, which is exactly why they are doing everything they can to turn barbarians loose in the country.

We can only vote for the people who run for office. And, in recent decades, the Republicans who have been on the ballot have tended to be…fools. They have succumbed to The Swamp. We need to find God-fearing, truly educated, intelligent, decent, moral, and courageous people to run for political office. We need those Americans in the majority in Washington, D.C. and state governments. I don’t know how to find them. But that’s what we need.

Otherwise, we will be governed either by totalitarians or by fools.

Why I started collecting videogames to defeat depression

 

I calmed down a bit and actually made some goals…instead of collecting every game out there…I’m collecting all NES/SNES/N64 games…both CIB and loose with a manual. It’s still a pretty crazy goal but…it’s easier than buying every game for every system.

But to dig into it…it’s a mix of nostalgia and obsession. I need something to focus and obsess on. If I don’t have something…I tend to get unfocused and just…end up in a not good place. I’m struggling with depression…so having something to keep my mind off of my dark thoughts is useful. It gives me a goal to keep getting out of bed, keep struggling through the days, just so I can complete my collection and finally be satisfied. Of course I also play games…gaming helps my depression A LOT. It sucks me into another world…so much better than my own. I can forget my troubles for a few hours a day. Or I can make my own character and live my life how I always wanted. Older games can put me back in my childhood as well…when I was happier…

That’s the focus of my game room…making it into something that resembles my childhood. So I can step into my game room, shut the door behind me and just get sucked back in time. I’ve put a lot of thought into this. I’m filling it with toys and stuff from my childhood (not just video game related stuff), the furniture in that room is going to be from the era, the TV is going to be a CRT, the carpet is going to be arcade carpeting (which glows under black light), going to have glow-in-the-dark stars up on the ceiling…heck, I’m even going so far as to only play 80s music in there via cassettes and an old boom box (since I can’t rely on any local station to just play 80s music).

It’s fun. There’s a thrill when you get a really good deal, or find a really rare game in a $2 bin at a flea market or a thrift store. Your collection is more than just a bunch of games; it’s a collection of finds with stories behind them. I think a lot of collectors are really collecting memories, like games they loved or really wanted. That’s definitely a big part of it.

There are a lot of collectors who go nuts cleaning their games, but I like the character. I love finding old receipts or a handwritten note in a game with cheat codes, combos, etc., or seeing a sticker from some town’s local video rental store that’s been out of business for a decade. I like to leave those old Kmart stickers or sharpied names on there. I don’t know, maybe I’m weird with this one.

Also, most people have an attachment to some console. When you collect and you have games for every console, it puts a smile on pretty much anyone’s face. People want to play those games, which makes picking up something like Mario Party or Fuzion Frenzy that much more fun.

Lastly, there’s nothing like a big, physical, expensive collection. Personally, I only buy games if I’m getting a deal. Every dollar I put in is at least doubled and grows over time. It feels good looking at a bookshelf filled with money. And it looks cool, too.

There is something about the NES, which first got me into video games: I like the look of the system, I like the music, I like the memories, I like the games, I like how they look displayed. There was a certain magic to it, and nostalgia plays a huge part. I do have nostalgia for other systems, but nothing like the NES…not when it comes down to it. I played NES from like age 7 to 12, which was basically the core of my childhood; in many ways, it’s the thing I did MOST in my childhood from like 1987 to 1992 or so. Playing NES helps me revisit many of those times, but I enjoy it in its own right too. Many of those good times were with people who are no longer living, and it helps me remember them, because the NES was such a huge part of my childhood.

I think collections are better when they are rarer, higher quality, more organized, and have some type of goal or theme in mind; they look better as a set or idea as a whole. I’d rather have a more well-rounded NES collection than a bunch of other stuff, and having less space and a limited budget helps to force this aspect.

This is just my thinking though. I look at massive collections of tons of consoles and it doesn’t do anything for me. Do I enjoy them? Sure. But I don’t want to collect for them. I also know my tendencies, and they can get out of control if not kept in check with rules, i.e. too much stuff, too much money spent, etc.

I also collect vintage cameras, vinyl, and books; those collections started in my youth or from stuff from my family.

The 25 greatest video game consoles – ranked!

 

  25. 3DO (1993)

The second true 32-bit machine after the FM Towns Marty, the 3DO was available via a unique business model: the 3DO Company (formed by Electronic Arts founder Trip Hawkins) licensed its technical specifications to third-party manufacturers such as Sanyo and Panasonic, which then built their own versions. Unfortunately, this approach made the hardware hugely expensive ($699 at launch – equivalent to $1,267 or £990 today) compared with rival consoles that could be sold at a loss by their manufacturers. There were some excellent titles, including the original Need for Speed and the strategy-shooter Return Fire, but the PlayStation killed it stone dead.

  24. Atari Jaguar (1993)

The veteran company’s final console boasted a powerful yet jumbled architecture based around two silicon chipsets (named Tom and Jerry) and a Motorola 68000 processor. Some say it was difficult to code for; it lacked a broad games catalogue, and at $249 it was also very expensive. Now best known for the trio of excellent shooters Tempest 2000, Doom and Alien vs Predator, it remains an intriguing technical oddity.

  23. Sega Master System (1985)

Throughout the early 1980s Sega made several attempts to transfer its arcade expertise to the home console market – the Master System was the most successful. More powerful and with a fuller colour palette than the mighty Nintendo Entertainment System (NES), the eight-bit machine boasted decent arcade conversions, but is best remembered for its scrolling platformers, including Alex Kidd in Miracle World, Wonder Boy, Psycho Fox and an expertly reduced version of Sonic the Hedgehog.

  22. Intellivision (1979)

With its brown and gold chassis, wood effect lining and retro-futuristic controllers, Mattel’s Intellivision screamed “It’s the 1970s!” from every angle. But, developed a year after the release of the Atari VCS, it was a much more sophisticated machine thanks to a 16-bit central processor and generous 16-colour palette. Famous for its pioneering use of licensed sports titles and its convincing arcade ports (Burger Time, Donkey Kong Jr, Bump N Jump …) there were also intriguing original titles such as the weird operating-theatre sim Microsurgeon and B-17 Bomber, which came with a voice synthesiser for, ahem, “realistic” speech effects.

  21. PC Engine (1987)

In the 1980s most consoles resembled toys – the PC Engine, with its futuristic white chassis and cool mini-cartridges (or HuCards), looked like something out of Akira. Designed by electronics giant NEC and game developer Hudson Soft, the console contained twin 16-bit graphics chips that brought a singular aesthetic quality to arcade conversions such as R-Type, Splatterhouse and Ninja Spirit. Released later in the US as the TurboGrafx-16, it’s a genuine cult classic.

  20. Sega Saturn (1994)

Hitting Japanese shelves a fortnight before PlayStation, Sega’s 32-bit machine would be forever defined by its failed rivalry with Sony. Its fragmented internal architecture was built around Sega’s cutting-edge arcade machine technology, but developers needed expert knowledge of assembly language to wrestle anything out of it. Still, Sega’s studios triumphed with Virtua Fighter, Nights into Dreams and Sega Rally, while its plethora of stunning 2D shooters and fighting games thrilled hardcore gamers.

  19. Colecovision (1982)

With a Z80 processor three times more powerful than the Atari VCS and a huge 16KB of video RAM, the Colecovision was a significant technological leap forward, allowing smooth animation and colourful visuals. Among its 125 games there were interesting original titles such as scrolling adventure Tarzan and Fortune Builder, an early SimCity predecessor. Later expansion modules let owners play Atari game carts and use a steering wheel controller. The machine is best known for excellent arcade conversions including Gorf, Zaxxon and Donkey Kong. Nintendo was so impressed that its head of R&D, Masayuki Uemura, used the Colecovision as inspiration for the NES.
The late Ralph Baer in an image from 2009, with his 1960s prototype of the first games console. Photograph: Jens Wolf/dpa/AP

  18. Magnavox Odyssey (1972)

The first ever games console was developed by engineer Ralph Baer while working for defence contractor Sanders Associates under the attractive code name Brown Box. It consisted of a white and, yes, brown box containing just 40 transistors and 40 diodes, with wires connecting to the TV and two blocky controllers. The games, basically variants of Pong, had no sound and colour was achieved by placing plastic overlays on the screen – but the profound concept of interacting with graphics on your TV began here.

  17. SNK Neo Geo (1990)

Like Sega, SNK was a 1980s arcade giant with a desire to infiltrate the home console market, but its approach was much more ambitious. It set out to build a machine using exactly the same technology as its coin-op hits. This resulted in astonishing home versions of Fatal Fury, Art of Fighting and The King of Fighters; the only catch was the Neo Geo cost three times as much as its competitors and the 330-megabit game carts cost up to $200 each. No wonder it’s known as the Rolls-Royce of game consoles.

  16. Atari VCS/2600 (1977)

For a while, in the late 1970s, Atari was video games. After the success of its home Pong console in 1975, the company’s designers spent a year constructing a microprocessor-based system that could play games based on ROM cartridges. During Christmas 1977, 400,000 machines hit US shelves and sold out almost instantly. The look of the machine, with its black fascia and wood panelling, and its simple eight-directional joystick, set the design ethos of the industry, while its games, with their beautifully illustrated boxes, were design classics in both form and function. Infiltrating pop culture via movies as diverse as Airplane! and Blade Runner, the Atari was more than a console – and even its failures, especially the botched film tie-in ET, became the stuff of legend.

  15. Nintendo GameCube (2001)

Beside the muscular design and technical potency of the PS2 and Xbox, with their internet connectivity and DVD support (the Xbox via an add-on), the GameCube looked like a giant Lego brick with a handle and its IBM PowerPC processor was underpowered compared to Microsoft’s machine. But Nintendo wanted a fun, characterful console that was cheap and easy to develop for – that’s what we got. Plus, Metroid Prime, Legend of Zelda: Wind Waker, Resident Evil 4 and Super Mario Sunshine competed with anything on those other machines.

  14. PlayStation 3 (2006)

Beset by difficulties and delays regarding its ambitious Cell processor and the inclusion of a Blu-ray drive, the PS3 was an intimidating project from the start. Its online multiplayer service was generously subscription-free yet inferior to the Xbox 360 offering and developers found it tough to work with the array of multiple synergistic processing units (SPUs). And yet this is where a new form of lush cinematic gaming experience flourished: Uncharted 2, God of War III, Demon’s Souls, Heavy Rain and Journey all pointed to a future of imaginative, challenging storytelling within rich immersive worlds. Their lessons are still being taught and learned.

  13. Nintendo 64 (1996)

Developed in conjunction with supercomputer specialist Silicon Graphics Inc and originally given the not-at-all hubristic codename Project Reality, the N64 was a contradictory beast – backwardly sticking with carts instead of embracing CD-ROMs but innovative in its use of an analogue joystick to allow accurate 3D movement. This, of course, led to Super Mario 64, the defining game of the era, but the console saw many other classics, including GoldenEye, Banjo-Kazooie, Wave Race 64 and Legend of Zelda: Ocarina of Time.

  12. Xbox One (2013)

Built more as a nicely boxed PC than a traditional console, the Xbox One boasts multicore AMD processors, HDR (and 4K video) compatibility, cloud storage, game streaming and a host of multimedia options. And just as the machine was beginning to age, its S and X iterations came along and boosted the specs. Alongside beautiful versions of multiplatform hits Witcher 3, Assassin’s Creed Origins and Fallout 4, it has also brought us Forza Horizon 4, Sea of Thieves, Halo 5 and Ori and the Blind Forest. State of the art, in many ways.

  11. Xbox (2001)

Teased by Bill Gates himself at the 2000 Game Developers’ Conference, Xbox was conceived by the team behind Direct X (Microsoft’s middleware for PC game developers) as a technological Trojan horse to get the company’s products into the living room. From those slightly Orwellian foundations came a robust, powerful and exciting machine, its back catalogue of more than 600 games boasting one of the greatest console first-person shooters ever in Halo, as well as gritty brawler Ninja Gaiden, surreal adventure Psychonauts and Star Wars epic Knights of the Old Republic. From the beginning, Xbox understood the rising importance of online play, with its integrated ethernet port and robust Xbox Live infrastructure. The most significant US-designed console since the Atari VCS.

  10. Nintendo Wii (2006)

Plenty of games industry pundits saw the tech specs for the Wii in 2005 and wrote it off as “two GameCubes taped together”. What they hadn’t figured on was the unique joypad, which used motion controls providing barrier-free access to simple, perfectly designed games such as Wii Sports, Wii Play and Wii Fit. Suddenly, whole families could compete together, from the youngest to the oldest – and Nintendo sold more than 100m units as a result. A victory for utilitarian design over technological obsession.

  9. Sega Dreamcast (1998)

With its built-in modem and hugely innovative controllers (complete with removable memory cards doubling as handheld games machines), the Dreamcast was a visionary piece of hardware, backed up by Sega’s astonishingly creative first-party development teams. The machine saw an array of idiosyncratic titles – Jet Set Radio, Shenmue, Seaman, Rez, Phantasy Star Online – that either invented new genres or utterly revolutionised old ones. But the lack of support from western developers and the sheer might of the PS2 ensured that its life was as brief as it was beautiful.
Nintendo employees and the GameCube in 2001. Photograph: Yoshikazu Tsuno/EPA

  8. Nintendo Switch (2017)

The follow-up to the interesting but flawed Wii U replicated that console’s two-screen approach, but made its built-in HD display truly portable, adding two motion-sensitive controllers into the mix. The Switch is a brilliantly flexible home console, seamlessly switching local multiplayer between the living room and the world at large, somehow combining the genius of the Wii and the Game Boy. Its best games – Legend of Zelda: Breath of the Wild, Animal Crossing: New Horizons, Super Mario Odyssey, Mario Kart 8 Deluxe, Super Mario Maker – exploit the delightful peculiarities of this machine perfectly.

  7. PlayStation (1994)

By 1993, Sony had failed to enter the console industry through collaborations with Sega and Nintendo, so the company’s hardware genius Ken Kutaragi thought, screw it, let’s make our own machine. He designed an architecture that was powerful yet easy to develop for and focused on pushing 3D shapes around the screen as efficiently as possible. Sony then solved its lack of development experience by purchasing UK studio Psygnosis and inking an exclusive deal with Japanese arcade veteran Namco. The resulting console ruled the 1990s, thrilling time-rich twentysomethings with titles such as Tekken, Gran Turismo and Tony Hawk’s Pro Skater. This machine changed everything.

  6. Famicom/Nintendo Entertainment System (1983)

It’s an oft-repeated fact, but always one worth remembering: throughout the 1980s in North America, you didn’t play “video games”, you played Nintendo. Like Hoover and Aspirin before it, the brand was so synonymous with the activity that it became genericised. This was down to the NES, a boxy, dated entry into the console market, which came with funny flat little joypads (heavily inspired by Nintendo’s successful Game & Watch handheld devices) and chunky carts. But the games, oh the games. Super Mario Bros, Legend of Zelda, Contra, Mega Man, Final Fantasy, Excitebike … This was where Nintendo’s (and specifically Shigeru Miyamoto’s) design genius originally flourished and where we learned the company’s maxim that old, well-used technology could be reformulated to realise amazing things.

  5. PlayStation 4 (2013)

Based around similar tech as the Xbox One and launched almost simultaneously, the PS4 saw Sony concentrating on games rather than multimedia functionality, immediately winning the PR war against Microsoft. With its excellent controller, vastly improved online infrastructure and seamless sharing and streaming functions, it’s an innovative system, but where it really wins is in its embarrassment of first- and second-party gaming riches. Uncharted 4, Horizon Zero Dawn, Marvel’s Spider-Man, God of War, Bloodborne, Death Stranding … all of them push at the possibilities of video games as a visual narrative medium.

  4. Sega Mega Drive/Genesis (1988)

By building an architecture capable of accurately converting arcade hits such as Golden Axe, Strider and Altered Beast, and bullishly marketing at teenagers, Sega made Nintendo look fusty and old-fashioned. This punk attitude was amplified further in 1991 by the arrival of Sonic the Hedgehog, a speed-obsessed, spiky-haired dude-bro perfectly in tune with early-1990s MTV culture. The Mega Drive would go on to sell 35m units and host a wide range of experiences from romantic role-playing adventures to real-time military sims. It wasn’t afraid to be weird and loud and rude, and some of us related hard to that. In the process, for better or worse, it invented the whole idea of console gaming as a lifestyle – an identity.

  3. Xbox 360 (2005)

The first console of the broadband era, Xbox 360 put online multiplayer functionality at the core of its offering from the very start. Innovations such as Achievements and the Gamer Score turned the global user base into one vast competitive community, whether the battleground was Call of Duty 4: Modern Warfare, Halo 3 or Forza Horizon. Arguably, this was also the machine that saw the superior versions of epic adventures such as Mass Effect 2, Elder Scrolls V, BioShock and Red Dead Redemption, ushering in our modern era of mature narrative game design.

  2. PlayStation 2 (2000)

It’s tough to win the console wars two generations in a row. PlayStation 2 didn’t just equal the success of PlayStation – it became the best-selling console of all time, shifting 155m units. Its utter dominance, its technical power and its familiar development environment allowed studios around the world to be extraordinarily creative. This was the golden era that saw mainstream blockbusters Grand Theft Auto, Metal Gear Solid, Gran Turismo, Pro Evo Soccer, Burnout and Ratchet & Clank come to fruition, but it also hosted idiosyncratic treasures such as Katamari Damacy, Ico and Okami, and through the Guitar Hero and Singstar titles it also became the post-pub entertainment platform of choice for a whole generation. This was where TV, movie and music creatives all woke up and realised, ah yes, games are the future, we’d better get in on this. And then everybody did.

  1. Super Famicom/Super Nintendo Entertainment System (1990)

Whole days in front of Street Fighter 2, the living room crowded with mates, coffee table loaded with snacks and Coke cans. All-nighters on Legend of Zelda: a Link to the Past, Super Metroid and Secret of Mana. Sharing Yoshi’s Island and Harvest Moon with your younger sister. Blasting through the Super Star Wars series. Discovering Donkey Kong Country. Millions of us have these memories. The SNES arrived in an industry already changed by the Mega Drive, but Nintendo stuck with what it knew – solid tech and astonishing, fecund creativity. The machine produced beautiful, colourful visuals and lush sampled sounds, and it had the flexibility to allow enhanced cartridges later in its lifecycle. But really, the lasting influence was all down to the games – more than 1,700 of them – and the way they made us feel. That is, in the end, what it’s all about.