ayjay + games   10

Tadashi Tokieda Collects Math and Physics Surprises | Quanta Magazine
Sometimes adults have a regrettable tendency to be interested only in things that are already labeled by other adults as interesting. Whereas if you come a little fresher, and a little more naive, you can look all over the place, whether it’s labeled or not, and find your own surprises.

So, when I’m washing my hands with my child, I might notice that if you open a faucet very thinly — not so that it drips, but a thin, steady stream of water — and you lift your finger gradually toward the faucet, you can actually wrinkle the water stream. It’s really fantastic. You can see beadlike wrinkles.

It turns out that this can be explained beautifully by surface tension. And this was known to some people, but 99.9% of the world population hasn’t seen this wrinkling of the water. So it’s a delightful thing. You don’t want to let go of that feeling of surprise.
games  math  science 
6 weeks ago by ayjay
Video Games Are Better Without Stories
The true accomplishment of What Remains of Edith Finch is that it invites players to abandon the dream of interactive storytelling at last. Yes, sure, you can tell a story in a game. But what a lot of work that is, when it’s so much easier to watch television, or to read. A greater ambition, which the game accomplishes more effectively anyway: to show the delightful curiosity that can be made when stories, games, comics, game engines, virtual environments—and anything else, for that matter—can be taken apart and put back together again unexpectedly.

To dream of the Holodeck is just to dream a complicated dream of the novel. If there is a future of games, let alone a future in which they discover their potential as a defining medium of an era, it will be one in which games abandon the dream of becoming narrative media and pursue the one they are already so good at: taking the tidy, ordinary world apart and putting it back together again in surprising, ghastly new ways.
games  from instapaper
april 2017 by ayjay
Faculty Spotlight: Erik Hurst | Becker Friedman Institute
In this strand of my research, I’m almost flipping that theory on its head by asking if it is possible that technology can also affect labor supply. In our culture, where we are constantly connected to technology, activities like playing Xbox, browsing social media, and Snapchatting with friends raise the attractiveness of leisure time. And so it goes that if leisure time is more enjoyable, and as prices for these technologies continue to drop, people may be less willing to work at any given wage. This explanation may help us understand why we see steep declines in employment while wages remain steady – a trend that has been puzzling economists.

Right now, I’m gathering facts about the possible mechanisms at play, beginning with a hard look at time-use by young men with less than a four-year degree. In the 2000s, employment rates for this group dropped sharply – more than in any other group. We have determined that, in general, they are not going back to school or switching careers, so what are they doing with their time? The hours that they are not working have been replaced almost one for one with leisure time. Seventy-five percent of this new leisure time falls into one category: video games. The average low-skilled, unemployed man in this group plays video games an average of 12 hours, and sometimes upwards of 30 hours, per week. This change marks a relatively major shift that makes me question its effect on their attachment to the labor market.

To answer that question, I researched what fraction of these unemployed gamers from 2000 were also idle the previous year. A staggering 22% – almost one quarter – of unemployed young men did not work the previous year either. These individuals are living with parents or relatives, and happiness surveys actually indicate that they are quite content compared to their peers, making it hard to argue that some sort of constraint, like being miserable because they can’t find a job, is causing them to play video games. The obvious problem with this lifestyle occurs as they age and haven’t accumulated any skills or experience. As a 30- or 40-year-old man getting married and needing to provide for a family, job options are extremely limited. This older group of lower-educated men seems to be much less happy than their cohorts.
economics  games  tech 
july 2016 by ayjay
The best new board games we played in 2015 | Ars Technica
Most Euro-style board games operate on the idea of players building an “economic engine.” You start with nothing and by collecting resources, you buy items that give you more resources, which lets you buy better things, which gives you even more resources—until you’re running an unstoppable point-generating machine. Splendor takes this immensely gratifying gameplay loop and pares it down to something even your grandmother can enjoy.

On your turn, you’ll collect gems from a common supply in order to buy cards from a central market. (The gems are represented by heavy poker-style chips that are absurdly satisfying to hold and play with.) The cards give you permanent gems that act as discounts on future purchases. You’ll work your way up through three tiers of ever-more-expensive cards to buy cards worth more and more points. The first person to 15 points triggers the end of the game.

Of all the smash-hit tabletop games released in 2014, Splendor is probably the one with the most staying power. It’s dead simple to teach, but it has enough strategy that seasoned gamers can happily play alongside noobs. It takes about 30 minutes to play, but it’s as fulfilling as many longer games. When I introduced the game to my Friday night game group, we didn’t play anything else for about two months. Splendor is an instant modern classic.
games  wishlist 
january 2016 by ayjay
Winning Isn’t Everything — Ian Bogost
Real systems thinking assumes simple answers are always wrong. Yet when we talk about the future—even the future of games or of systems literacy—we tend to assume that they will unleash their transformative powers in a straightforward way, through ideas like a century with a dominant medium. We are meant to speak like Pollyannas about “changing the world,” rather than admitting that the very notion of changing the world is anathema to the fundamental promise of systems literacy, namely a rejection of simplicity and a distrust of singular answers. After all, it’s not clear at all that the 20th century is best summarized as a century of the moving image, anyway. Too much happened to pin down a single influence or media form as dominant. Systems thinking would force us to admit that any singular innovation is caught up in a web of others. We could just as easily call the last century the “electric century,” because so many of its inventions and innovations were bound up in the rollout and use of electric power. Or perhaps the “recorded century,” because photography, phonography, and other methods of analog capture and preservation rose to prominence (eventually fusing into film) — not to mention digital information storage. Cinema itself relied on the rise of leisure and the desire for escape, facilitated by two decades of economic catastrophe and war during the Great Depression and World War II. Those features were only further amplified by the rise of suburbanism and automobile culture of the 1950s, where cinema coupled to youth, desire, and freedom.
tech  games  futurism  history 
december 2014 by ayjay
Public Books — Behind the Dungeon Master’s Screen
Lipsyte—who, incidentally, once sang in a noise-punk band called Dungbeetle—suggests that D&D, in its canonization as a reputable after-school activity rather like chess or robotics club, may risk losing its garish subcultural soul. The 40th anniversary of the creation of Dungeons and Dragons has occasioned a newly respectful discourse about role-playing games. Fantasy gaming, like so many other subcultures before it, has passed through what could be called the four stages of cultural recuperation. Initially ignored by the mainstream, then feared as a dangerous influence, then ridiculed as a sad waste of time for nerds and losers, gaming is now respectfully celebrated. We hear the ringing tones of a newly positive discourse of teaching and learning, of role-playing games as useful instruction. Not simply a fun way to while away a weekend afternoon, Dungeons and Dragons, as Díaz himself has said, serves as a “storytelling apprenticeship” for authors, a “formative narrative media.” For others the game has been a literary tutorial that can “help build the skills to work collaboratively and to write collaboratively.” The novelist and editor Ed Park “celebrates the magnificent vocabulary of the game” (“melee,” “thaumaturge,” “paladin,” “charisma,” “homunculus”); the founder of two “successful technology companies” even asserts that “Dungeons & Dragons helped train him for the rigors of tech entrepreneurship.”
fantasy  games 
december 2014 by ayjay
The bizarre, mind-numbing, mesmerizing beauty of “Twitch Plays Pokémon” | Ars Technica
It could even be that Twitch Plays Pokémon is a bleak-but-perfect summary of the human condition—a group of people unified behind a common cause that struggles and fails to accomplish even the most basic tasks. We ostensibly want the same thing, yet we expend Herculean amounts of effort only to end up right back where we started—at best. And that's the case even without considering the people who are only out for themselves.

In any case, Twitch Plays Pokémon encapsulates the best and worst qualities of our user-driven, novelty-hungry age. Today's Internet has an extraordinary propensity for creating things that (1) grow quickly, virally, and organically through word of mouth, (2) provide hours of entertainment, and (3) waste days of people's lives for no apparent purpose (see also: Flappy Bird).

Twitch Plays Pokémon is a value-free time sink and it's deeply, deeply stupid. I know that. But if you know how to stop watching, you're doing better than I am.
games  socialmedia  tech 
march 2014 by ayjay
Perpetual Adolescence | Ian Bogost
This is an unpopular opinion. Gone Home has been met with almost universal praise in the gaming community, a world where numerical scores on a ten-point scale mean everything, and where Gone Home has achieved mostly 9s and 10s. After playing, dude-bro game dev celebrity Cliff Bleszinski gushed, "This game moved me in a way that I've never been moved by a game before." Lesbian, queer, and transgender players—an increasingly vocal and welcome counterpoint to traditional straight male voices in game development—penned love letters to the game, expressing how it captured their own teenage disquiet.

It's impossible and undesirable to question these reactions, to undermine them with haughty disregard. But it's also not unreasonable to ask how these players could have been so easily satisfied. For readers of contemporary fiction or even viewers of serious television, it's hard for me to imagine that Gone Home would elicit much of any reaction, let alone the reports of full-bore weeping and breathless panegyrics this game has enjoyed. I felt charmed upon completing Gone Home, but then I felt ashamed for failing to meet the emotional bar set by my videogame-playing brethren.

Compared to classic and contemporary works of literature on the challenges and implications of queer love (Virginia Woolf's Orlando, or Lillian Hellman's "The Children's Hour," or Pamela Moore's Chocolates for Breakfast, or Alice Walker's The Color Purple, or Bertha Harris's Lover, or Rita Mae Brown's Rubyfruit Jungle, to name but a few of the most obvious candidates), Gone Home would seem amateurish, forced, heavy-handed. Even Gary D. Wilson's "Sweet Sixteen," a five-hundred-word microfiction about teenage love and its midlife aftermath, makes Gone Home feel trite and boilerplate. For a literary audience, Gone Home will certainly be more appealing than Bioshock—but less appealing than, say, Jeanette Winterson's Oranges Are Not the Only Fruit, a book Bioshock players have no more heard of than readers of Winterson have heard of Ken Levine.
games  criticism  from instapaper
december 2013 by ayjay
Dwarf Fortress, SimCity's Evil Twin : The New Yorker
More fundamental than this, though, is the very particular worldview that animates all the SimCity games. The world Wright gives his players is one defined by a constant flickering interplay between progress and equilibrium, a gentle utopia of possibility. Decay is never a real threat. His cities never die, and if left to their own devices they pretty much go on as they were. The closest thing to failure is a genial sort of rut, an inability to make the city grow and progress the way you’d like; excepting perhaps the aftermath of a nuclear power plant melting down, there’s never an irreversible collapse. Without extreme, juvenile levels of incompetence, you can’t fail to make or maintain a city, you merely fail to make that city great. It’s a commonplace that many urban planners found their vocation in childhood games of SimCity—and this at least rings true, for the game is nothing if not inspirational. Its world is infinitely soothing, its consistent message one of safety, surmountable challenge, hope, and stability.

The appeal of such fictional peace does have its limits, as it turns out. One can begin to suspect that all thriving cities look pretty much the same, that even the most successful equilibrium is simply boring. The popularity of “disasters”—the calamities, ranging from fires and airplane crashes to, in the more baroque later versions, locust swarms and U.F.O. attacks, that the player can purposely inflict on his city, or allow to occur randomly—bespeaks this creeping boredom. But it points as well to a desire to demonstrate the strength and elasticity of the world’s stability. These disasters are designed to be manageable. There is never an unfixable problem, never a ruin that can’t be cleared and rebuilt. It is an almost comically American vision, a pure product of the Reagan dream: zero history, infinite future.
games  city  from instapaper
april 2013 by ayjay
The Curse of Cow Clicker: How a Cheeky Satire Became a Videogame Hit | Wired Magazine | Wired.com
On one level, this was all part of the act. Bogost was inhabiting the persona of a manipulative game designer, and therefore it made sense to pull every dirty trick he could to make the game as sticky and addictive as possible. But as he grew into the role, he got a genuine thrill from his creation’s popularity. Instead of addressing a few hundred participants at a conference, he was sharing his perspective with tens of thousands of players, many of whom checked in several times a day. Furthermore, every time he made the game better, he received some positive bit of feedback—more players, a nice review, a funny comment on his Facebook page. Tweaking the game was almost like a game itself: Finish a task, receive a reward.

The number of players peaked at 56,000 in October before beginning a long slide down to 10,000. What Cow Clicker lost in numbers, however, it gained in fan fervency. The people who remained may have begun playing in cheeky protest, but they soon began taking it surprisingly seriously. “There is a fair amount of strategy—maybe more than Ian intended,” says Kevin Almeroth, a computer science professor at UC Santa Barbara who climbed to the top of the leaderboard, earning a golden cowbell in the process. “You have to get the top clickers in your pasture and lure them away from somebody else. I actually started to understand the psychology of Survivor a bit better.” One player wrote an online strategy guide, which included chapters like “Advanced Pasture” and “Harvesting Strategies.” In November 2010, hackers discovered the game and set up fake Facebook accounts and scripts to maximize their clicks. At first, Bogost let the cheaters prosper, but outrage from the player community eventually overwhelmed his resolve, and he added a verification system to crack down on the counterfeit clicking.
DH  tech  games  from instapaper
march 2013 by ayjay