Sunday, November 29, 2015

My YouTube did something ahead of its time

SAN FRANCISCO — More people watch football on TV than play it. The same may be true for video games one day.
That may sound ludicrous to people who believe that playing a game is a lot more fun than watching somebody else do it, but that’s the prediction from a panel of esports advocates in a session at Casual Connect, a large conference on gaming in San Francisco.
Matt Patrick, the president of Theorist and creator of The Game Theorists channel on YouTube, has more than 45 million viewers a month on his channels. The size of his audience already sort of proves the point. Based on a poll he took of viewers, he noted that Five Nights at Freddy’s, a horror title set in a Chuck E. Cheese’s-like setting, is a case where more people are watching funny viral videos about the game than actually playing it.
“You are already there, and you should not be scared of this,” Patrick said. “Look at basketball. Some enjoy watching and engaging as a fan. A small subset played Five Nights at Freddy’s, but a huge number are engaging and buying things. They are sharing and talking about it. There are ways to monetize people who aren’t playing your games.”
“I believe the industry will accelerate in growth because there will be a lot more viewers than gamers,” said Peter Warman, the chief executive of market researcher Newzoo and moderator of the panel discussion.
Valve’s finals for the Dota 2 International Championships, which paid out more than $18 million in prizes, drew tens of millions of viewers. But esports do have a long way to go before spectators obviously outnumber the players. And it’s hard for newcomers to understand what is happening in a fast-paced game at first.
“It happens when games become understandable enough,” said Super Evil Megacorp chief operating officer Kristian Segerstrale, whose studio made the Vainglory mobile multiplayer online battle arena (MOBA) game. “Everyone has bounced a basketball. If you get to a game that is understandable after you watch it once, then we will get there.”
He noted that Vainglory’s viewership on the gameplay livestreaming site Twitch tripled in the course of a few months.
“Our dream is to make the world’s first truly mass market esport,” Segerstrale said.

Thursday, November 19, 2015

Crazy video game budgets kill good games

#5. We Put People Who Don’t Know Gaming in Charge
Not to stereotype here, but the type of person who knows how to make an awesome video game about fighting dragons with a giant chainsaw tends not to be the same type of person who is an expert in business and finance. So if you look at the CEOs and executives of game studios today, you won’t find many that actually have professional experience working as game designers. And that would be fine, except for the fact that, due to the way games are made, these guys wind up making the creative decisions. It’s similar to the problem with big movie studios, only much worse.
The result is that the gaming industry is driven by aphorisms. For example, it’s an entrenched belief that the only truly successful games are branded titles, sequels, and reboots — that’s what the reports tell them. So what I found was that there was kind of a self-fulfilling prophecy at work. I was once told point-blank by a non-gamer CEO that “there’s no market for sci-fi games” in a certain genre — at the time, fantasy games dominated that market. And who’s to say he was wrong? When has switching from fantasy to sci-fi ever worked out?
It’s not that they know nothing about games; it’s that they know just enough to be wrong. Ever go to a game forum and notice how every player thinks he’s better than the designers? That combat would be perfectly balanced if the developers would just change that +2 to a +3 for his class? Now imagine that those people are running the business, and you have a pretty good idea of what the problem is — creating a perfect game looks easy from the outside, in the same way that every skilled job looks easy from the outside. And those outsiders are in charge.
So why not just have the money folks focus on the business end of game design and let the actual designers do the designing? Well, it turns out “the business end” includes deciding how many hours of content are in the game, whether you have linear progression with lots of levels, how the DLC is going to work, whether powerful items get sold in the cash shop, whether the game is a first-person gritty realistic shooter with RPG elements or an action RPG with gritty realism and lots of guns … basically, everything that matters. The people who have the most experience and actually know what they’re doing are basically just polishing the ideas that the execs come up with.
We don’t like it any more than you do. I can tell you that professional game developers are some of the most hardcore gamers you’ll find, but for the most part, they’re just not allowed to work on the kind of games that interest them (I consider myself lucky to have dodged the bullet and gotten to work on a licensed property that I deeply loved). And while a professional developer will instinctively know — and I’m just pulling stuff out of my ass here — that it’s awful to require an online connection to play a single-player game or it’s asinine to have pre-order DLC before the game’s console is even out, the execs will think those are great for the bottom line, and they make the rules.
“Historically, the best thing about SimCity games is the Internet.”
And it’s only getting worse, because …
#4. Budgets Have Gone Insane, and That’s Making Innovation Almost Impossible
Nobody feels sorry for corporations, so when you hear EA or Activision groaning about how games cost too much to make, it feels like the consumers won. “That’s right, now put it on a gold-plated disc and do a little dance while you’re handing it to us, Mr. Blizzard!” But these budgets are, unquestionably, making games worse.
Let’s say you’ve been put in charge of planning a child’s birthday party, for some reason — maybe you lost a bet or something. You’ve got one day to plan, a $50 budget, and five people to help you. Not a big deal, right? Put some balloons in the yard and hire a clown. Done. But what if that party was for a rich kid and your budget was $50 million? Do you think that makes it easier or harder? Let’s put it this way: Instead of five friends helping you, it’s 500 strangers, and all of them have different ideas about what a party should look like. How long until you see your first fistfight break out? How far into the party before you hear yourself scream, “OK, who hired the stripper?!?”
Well, in the world of game development, this change from small-scale projects to massive productions happened overnight — the average game costs freaking 30 times as much as it did in the days of the original Sony PlayStation. Back then, a game could be made for as little as $800,000, but by the PlayStation 3 era, the average had ballooned to $28 million. With the new consoles, that’s going to go up again. At this point, it’d be cheaper to just create real zombies to chase people around.
In the older, simpler days (way back in the 1990s), you probably had a core team of a dozen developers and one or two vision holders who could keep the entire design in their heads. The producer had about nine months of planning until launch. Fast forward to today: Star Wars: The Old Republic cost upward of $500 million. Games require massive teams (some of them in another country) and years of development, and that’s not counting the umpteen false starts during preproduction. And this change happened too fast for studios to adapt — most are still operating with the same basic structure they had two decades ago.
Not that it’s any easier on their end. Remember, it’s not just the development that’s changing so fast; it’s distribution as well — today, studios are competing in a market that’s permanently saturated by $3 indie games on Steam and the nostalgia-driven GOG.com. Time-honored marketing strategies don’t work anymore, and the industry is struggling to find a replacement. Even though it feels like prices have gone up, if you take inflation into account, games right now are the cheapest they’ve ever been. Add it all up, and most studios are one failed game away from bankruptcy.
So from the gamers’ end, it’s easy to complain that the market is saturated with first-person shooters (the new consoles are picking up that banner with Killzone: Shadow Fall, Titanfall, and Destiny, in addition to the uninterrupted stream of Call of Duty and Battlefield games), but the fact is that the market is utterly reliant on those games’ sales. The consistent success of go-to franchises like Madden NFL is probably the only thing separating the current industry from a 1983-style crash right now. It’s not that they’re playing it safe by going back to the same well again and again — they’re doing the only thing that will let them survive.
#3. Publishers Are Gaming the Review System
Raise your hand if you’ve paid $60 for a heavily promoted game that got near-perfect review scores, only to find it to be a frustrating, cookie-cutter mess that had you doing a mental inventory of all of the things you could have bought with the cash instead (“three remote control flying sharks!”). Do the critics, like, get a different version or something?
This is a huge problem from the consumer end — games are a much bigger time and money investment than movies, books, or any other media, so having honest reviewers you can rely on is crucial. You’re trying to get an opinion on what might be the only game you buy for the next couple of months, but it’s becoming increasingly apparent that critic scores and user scores just don’t sync up. Call of Duty: Ghosts currently has a score of 74 on Metacritic — not a fantastic score for a AAA game with that kind of budget, but check out the average user score: 2.3.
Is that just a bunch of young gamers throwing a tantrum because they thought the game would have actual ghosts in it? Well, read the reviews — the critics’ write-ups boil down to “It’s a recycled version of the old games, but still good,” while the users’ consensus is “$60 is a lot to pay for recycled material, guys.” You can see that same divide with lots of games — Total War: Rome II had a respectable 83.5 score at launch (currently down to 76), but the user reviews? 3.9/10. Mass Effect 3 is at 89 for critics vs. 5.0 for users, the latter group being way less forgiving of an ending that rendered every previous choice in the franchise meaningless.
So why do the critic scores skew so much higher? Well, behind the scenes, studios are doing everything they can to obtain the highest Metacritic score possible at launch — some teams even get bonuses for hitting Metacritic targets. From the publication’s standpoint, those reviews exist to bring website traffic. That traffic turns into revenue from advertisements … that were purchased by the studios whose games are getting reviewed.
If you give the game an award or especially high praise, the publication could appear on that game’s box — the reviewers aren’t paid, but they get valuable exposure. Basically, the publishing companies are paying the review site’s bills by buying ads and handing out free publicity, so from the struggling writer’s perspective, it’s bad business to give bad reviews.
Journalists are invited to the studio or a rented room at a convention. They play the most polished level and/or segment of the game for a couple hours, maybe over the course of a few days. Drinks and meals are on the house. Keep in mind that they’re getting dropped into the middle of the game somewhere, because complicated gameplay that builds on lessons learned in previous levels would be extremely frustrating, whereas you want the journalists to experience fun and excitement. So we’re talking graphics, simple combat, flashy cinematics, and controlled linear environments that look really good — as long as the journalists never stray from the path, which is why there are marketing execs looking right over their shoulders and telling them where to go. And it’s amazing how intuitive level design becomes when the guy who designed the level is there to explain it.
But that disparity between user scores and critic scores is going to catch up with us, and it won’t just be the critics who get bitten in the ass. If the gamers don’t have any critics to trust, they’re going to stop buying games the day they come out. And since the industry puts so much emphasis on launch day box sales, that’s going to look an awful lot like a crash to the people in charge.
#2. Games Are Built for Hardware That Doesn’t Exist Yet
Every gamer has had the experience of seeing some amazing preview trailer for a game set to come out a year or so later, only to have the game arrive and look nothing like that preview (although some are worse than others).
Well, think back to your hypothetical child’s birthday party from earlier. Imagine going through that hectic process, only to find that the guest list changes about once an hour. Suddenly you have to accommodate more kids. The backyard you planned to use isn’t big enough. The one clown you hired won’t be enough to terrify all of them. So you rethink your plans, which means redoing some of the work you’ve already done. Then, an hour later, it changes again — this time it’s not just more kids, but different kids. Some of them have peanut allergies. Each little change means you have to completely rethink what you’re doing.
Well, in the world of video games, it’s the hardware that’s changing under our feet — beyond having new consoles every few years, new video cards for PCs are arriving constantly. To use each feature on a given video card, you write render code, and sometimes you’re writing render code for hardware that isn’t even out yet. Remember, it can take several years to make a game, and in that time the available hardware is going to change multiple times. You can be pretty far along in the process and still be unsure of how the graphics are going to look … in an industry where the main selling point is graphics.
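To make “render code” concrete, here is a minimal sketch of the idea: probe the card for capabilities, and give every shiny new code path a fallback. The feature names and probe function below are hypothetical and purely illustrative; this is not any real engine’s API.

```python
# Minimal sketch of capability-based render-path selection.
# Feature names and the probe are made up for illustration.

def probe_gpu_features() -> set[str]:
    """Ask the driver what this card supports. Stubbed here; a real
    engine would query Direct3D/OpenGL/Vulkan for capability flags."""
    return {"basic_shaders"}  # pretend the fancy hardware never shipped

def render_fancy(scene: str) -> None:
    print(f"Rendering {scene} with tessellation and compute shaders")

def render_basic(scene: str) -> None:
    print(f"Rendering {scene} with the plain shader path")

def pick_render_path(features: set[str]):
    # Try the newest path first, but keep a fallback for every feature,
    # because the hardware you targeted at the start of development may
    # not even exist yet when the game ships.
    if {"tessellation", "compute_shaders"} <= features:
        return render_fancy
    return render_basic

render = pick_render_path(probe_gpu_features())
render("castle courtyard")  # falls back to the basic path on this "card"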
So, once the coders figure out how to implement the features, the artists have to make assets, and then designers are given all of those beautiful set pieces and told to “add gameplay.” Put it all together, and 80 percent of the work on a game generally gets done in the last 20 percent of the schedule. Up until then, it’s all promises: hardware developers promising that their shit will be able to do what they say, level designers promising they’ll be able to use it, programmers promising they won’t have too much stuff to fix, and the whole time the company is releasing promotional material that they totally promise is what the final game will look like.
Right now we’re in the middle of a new console launch — these periods are the worst of all. Remember, all of the games available at launch started development long before Sony, Microsoft, or Nintendo even knew what their consoles were going to look like. It takes so long for developers to catch up and figure out the hardware that by the time the true power of a console is discovered, it’s time to start all over — an unfortunate reality that is apparently deliberate.
#1. Brutal Working Conditions Are Driving Away the Talent
I’m not exaggerating when I say that, for the vast majority of studios in the game industry, working conditions are awful, and burnout is rampant. The frantic period before a game releases is called crunch time, and it’s marked by 60- to 80-hour weeks. And “crunch time” can last up to a year. No, it’s not as bad a job as that guy who has to crawl up the constipated elephant’s butt at the zoo, but it’s bad enough that it’s hurting the games. The best and brightest veterans get driven away.
It’s easy to see why. Around your late 20s and early 30s, most people are looking to settle down and start a family. That means a stable, secure job, not 12-hour days plus weekends with no overtime pay under the looming threat of layoffs after the project ships. Producers and programmers can easily work on commercial software for banks or oil companies, and designers can easily transition into anything related to front-end user experience — and in both cases, they’ll be making way more money for way less work.
So the game industry is exporting experienced game developers while importing businessmen with no game industry experience to oversee the starry-eyed inexperienced juniors who remain. The only ones who tend to stay behind are the artists, because they’re stuck with highly specialized skills (like animating boobs), which skews the industry’s expertise in that direction.
The result of all of this isn’t that games are going away — it’s just that the industry may look very different a few years from now. That might not be a bad thing.

Monday, November 09, 2015

PlayOnLinux: a free Wine frontend

I have found a free alternative to the CrossOver Linux Wine frontend, called PlayOnLinux. I installed the .deb package on Linux Mint 17.2, and it pulled in all the dependencies for me.
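If you would rather do it from a terminal than through Mint’s graphical package manager, here is a rough sketch of the equivalent install, scripted in Python. The .deb filename is illustrative, and the dpkg-then-fix-dependencies sequence is my assumption about what the graphical installer does for you.

```python
# Rough sketch: install a downloaded .deb, then let apt resolve dependencies.
# The filename below is illustrative -- use whatever version you downloaded.
import subprocess

def install_deb(path: str) -> None:
    # dpkg installs the package but does not fetch missing dependencies...
    subprocess.run(["sudo", "dpkg", "-i", path], check=False)
    # ...so ask apt to pull in and configure whatever dpkg complained about.
    subprocess.run(["sudo", "apt-get", "install", "-f", "-y"], check=True)

if __name__ == "__main__":
    install_deb("PlayOnLinux_4.2.9.deb")
```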


Wednesday, November 04, 2015

the new American dream



The American Dream has changed. It used to be a college education, a steady job, a nice house (and a family to fill it), and a better financial picture than your parents. There is a new American Dream that is still about “doing better than your parents,” but not in a financial sense. This dream is about fulfillment.

Boston-based artist Ariel Freiberg just got engaged, and she and her fiancé are gearing up for this new dream. “We were brought up to think it’s important to own a piece of property. It’s how you build your life in this country. But buying a house is not a major goal for us. It is not what will make our lives secure, and it will not help us define ourselves.”

“The idea of the American dream is taken out from under us,” explains Anya Kamenetz, blogger and author of the book Generation Debt. “There used to be a contract with employers — healthcare, pensions, predictable employment,” but today there are none of those guarantees.

Additionally, the cost of a college education is far outpacing inflation, making it more difficult to take the first steps toward the American Dream, according to Tamara Draut, author of Strapped: Why America’s 20- and 30-somethings Can’t Get Ahead. The average student loan balance comes to around $20,000, which means about $200 a month out of an entry-level paycheck. On top of that, between 1995 and 2002, median rents in almost all major cities increased more than 50%.
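That $200-a-month figure holds up as a rough amortization. Here is a quick sketch of the math, assuming a standard 10-year repayment term at a 4% rate (both assumptions of mine, not figures from the article).

```python
# Back-of-the-envelope check of "a $20,000 loan means ~$200 a month".
# The 10-year term and 4% annual rate are assumptions for illustration.
principal = 20_000.0
annual_rate = 0.04
n_months = 10 * 12

r = annual_rate / 12  # monthly interest rate
# Standard amortization formula: payment = P * r / (1 - (1 + r)**-n)
payment = principal * r / (1 - (1 + r) ** -n_months)
print(f"${payment:,.2f} per month")  # -> about $202 per month
```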

Hillary Clinton recently gave a speech about how “a lot of kids don’t know what work is” and young people “think work is a four-letter word.” These were not renegade words, but rather an expression of the prevailing attitude among her fellow baby boomers.

The boomers mistake a rejection of their American Dream for a rejection of reality. But here’s some news: Young people know that work is a reality for everyone. It’s just that everyone needs to work toward something, so young people have a new American Dream.

“The new American Dream is much more entrepreneurial,” says Kamenetz. “And it’s about shaping one’s own destiny: mobility, flexibility to do your own work, and the ability to have a career as an expression of who you are as a person.”

Here are some things to keep in mind as you craft your own version of the new American Dream:

1. Cushion an entry-level salary with a move back home.
The first step in restructuring the American Dream is to save money to ensure flexibility. Moving back with your parents is smart if you can do it. Most jobs are in big cities, and starting salaries simply cannot pay the rent in those cities. People who are not able to get subsidized housing from parents are much more limited in terms of their early career choices.

2. Get comfortable with risk taking.
The new American Dream is for risk takers. This is actually not groundbreaking in terms of the American Dream. For immigrants, the American Dream has always meant risk-taking. But today young people are taking risks that parents would have never dreamed of, like playing contact sports without any health insurance and signing up for a mortgage with a freelance career.

3. Protect your time.
The American Dream of Baby Boomers came at the expense of personal time and family time. Success is not having more things than your parents. It’s having more time. More time for hobbies, for travel, for kids. “It’s not about how much money you have, it’s about living your life on your own terms,” says Barbara Stanny, financial coach and author of Overcoming Underearning.

4. Don’t assume personal fulfillment requires a small career.
Sure, the new American Dream has nothing to do with financial studliness. But don’t sell yourself short in the name of personal time. “Higher earners with balanced lives don’t work more hours, they are just more focused,” says Stanny. “To make more money you don’t have to work more hours. There is a difference between settling for a low income and taking a job to feed your soul.”

5. Buy as small a home as you can.
You preserve the most options for your future if you can buy a home on one income. “The advice used to be: always buy the most expensive house you can afford because it’s an investment. Today it’s different. Buy only the amount of house that you need so it doesn’t become an albatross around your neck,” says Phyllis Moen, author of Career Mystique: Cracks in the American Dream.

6. Make decisions by looking inside yourself.
Be aware of the tradeoffs you’re making. For example, big cities are exciting and filled with career opportunity, but you pay a high premium for living there.

When talking about her decision to stay in Boston, Freiberg says, “There’s a certain vibration living in the city that feeds me and my fiancé — this inspiration is something that we can’t get in the suburbs.”

Choices are difficult today because the new American Dream is not as measurable as the old one. You cannot look at your bank statement or count your bedrooms to assess your success. The new American dream is about fulfillment, which is a murky, slippery goal, but young people like Freiberg know it when they feel it, and you will, too.

Monday, November 02, 2015

Early adult life is best if you are lost.

It used to be that the smart kids went to graduate school. But today, the workplace is different, and it might be that only the desperate kids go to graduate school. Today there are new rules, and new standards for success. And for most people, graduate school is the path to nowhere. Here are seven reasons why:

1. Graduate school is an extreme investment for a fluid workplace. If you are graduating from college today, you will change careers about five times over the course of your life. So going to graduate school for four years—investing maybe $80,000—is probably over-investing in one of those careers. If you stayed in one career for your whole life, the idea is more reasonable. But we don’t do that anymore, so graduate school needs to change before it is reasonable again.

2. Graduate school is no longer a ticket to play. It used to be that you couldn’t go into business without an MBA. But recently, the only reason you need an MBA is to climb a corporate ladder. And, as Paul Graham says, “corporate ladders are obsolete.” That’s because if you try to climb one, you are likely to lose your footing due to downsizing, layoffs, de-equitization, or lack of respect for your personal life. So imagine where you want to go, and notice all the people who got there already without having an MBA. Because you can do that, too, in a wide range of fields, including finance.

3. Graduate school requires you to know what will make you happy before you try it. But we are notoriously bad at knowing what will make us happy. The positive psychology movement has shown us that our brains are actually fine-tuned to trick us into thinking we know about our own happiness. And then we make mistakes. So the best route to happiness is one of trial and error. Otherwise, you could over-commit to a terrible path. For example, today most lawyers do not like being lawyers: more than 55% of members of the American Bar Association say they would not recommend getting a law degree today.

4. Graduate degrees shut doors rather than open them. You better be really certain you know what you’re going to do with that degree because you’re going to need to earn a lot of money to pay it back. Law school opens doors only to careers that pay enough to repay your loans. Likewise, your loan payments from an MBA program mean that you cannot have a scrappy start-up without starving. Medical school opens doors to careers with such bad work-life balance that the most popular specialty right now is ophthalmology because it has good hours.

5. If you don’t actually use your graduate degree, you look unemployable. Let’s say you spend years in graduate school (and maybe boatloads of money), but then you don’t work in that field. Instead, you start applying for jobs that are, at best, only tangentially related. What it looks like is that you are asking people to give you a job even though you didn’t really want to be doing that job. You wanted another job but you couldn’t get it. No employer likes to hire from the reject pile, and no employer wants to be second choice.

6. Graduate school is an extension of childhood. Thomas Benton, columnist at the Chronicle of Higher Education, says that some students are addicted to the immediate feedback and constant praise teachers give, but the work world doesn’t provide that. Also, kids know how to do what teachers assign. But they have little idea of how to create their own assignments—which is what adult life is, really. So Benton says students go back to school more for comfort than because they have a clear idea of what they want to do with their life.

7. Early adult life is best if you are lost. And I was lost for my first four years of adulthood, so I picked college. It used to be that you graduated from college and got on a path. The smart kids got themselves on a safe path fast. Today there are no more safe paths; there is only emerging adulthood, where you have to figure out who you are and where you fit, and the quarter-life crisis, which is a premature midlife crisis that comes when people try to skip over the being-lost part of early adult life. Being lost is a great path for today’s graduates. And for most people, graduate school undermines that process with very little reward at the end.

Dan Ariely, economist at MIT, found that when people have a complicated choice to make—and there is a default choice—they pick the default nearly every time. So if your parents or friends went to graduate school, you are likely to do the same, not because it’s good for you personally, but because choosing the alternatives seems more difficult. But making exactly that kind of difficult choice is what your early adult life is all about. So don’t skip it.