With the growing number of turnkey web services that automate common processes, alternate reality games (ARGs) built on multi-platform storytelling keep multiplying. But it isn’t the ability to fake phone calls or manipulate images on the fly that makes these games work; it’s their ability to understand the people who play them. While fears of artificial intelligence and powerful technology keep compounding, it’s easy to forget that social engineering has become the preferred method of hacking. One ARG, called Subtext, uses technology to demonstrate that we’re our own greatest vulnerability.
Subtext is a game you play with your real life. You use your own phone number to text a guy trapped in a facility somewhere in the United States. It seems like a straightforward tale of science gone too far, but there’s more beneath the surface of the conspiracy narrative. The game has you solve a series of puzzles that require real-world information and experiences, including phone calls and emails with employees of an artificial intelligence company called Mensera. Even though you know you’re playing a game, the experience still manages to create doubt from time to time, leaving you wondering whether a human jumped into the mix during some of the interactions.
William O’Connell, the surprisingly young developer of Subtext, pulls this off through an excellent understanding of human behavior, which he builds directly into the gameplay. ARGs already set the stage for you to believe they could be real because they emulate ordinary human behavior using everyday technology. Instead of watching a conversation, you participate in one, and it’s that participation that urges you to accept the game as part of your actual reality. The result is a living narrative that primes you to accept a computer process as human in the moments that feel more real than you expected, because they happen in ways you struggle to immediately explain.
Subtext exploits every opportunity to make you question with whom—or what—you’re communicating and does so at an ideal rhythm. It sets up situations with predictable outcomes and knows how to react to each one. It’s a game that, at times, feels more like the artificial intelligence we assume Google’s hiding in a secret underground facility than the frustratingly inconsistent experiences delivered by something more common like Siri or Alexa. Subtext, however, isn’t artificially intelligent at all. It’s organically intelligent because its creator understands how to hack the player’s mind more successfully than any modern technology.
You don’t have to look far to see how easily we’ll accept an automated process as a human when its creator understands the necessary ingredients for a specific sliver of reality. Lenny, a bot built to waste telemarketers’ time, is one of William’s favorite examples because it excels at convincing callers that it’s a confused old man. Subtext pursues a similar goal, socially engineering an experience that convinces players of its reality. Like Lenny, the game creates a series of circumstances that make your behavior far more predictable, then uses that predictability to make you do what it wants.
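To see how little machinery that kind of trick actually needs, here’s a minimal, hypothetical Python sketch of a Lenny-style scripted responder. It isn’t Lenny’s real implementation (the genuine article plays prerecorded audio clips over a phone line); it just illustrates the principle the article describes: a fixed loop of vague, well-timed replies can hold up a conversation because the other party’s behavior is predictable.

```python
# Hypothetical sketch of a Lenny-style scripted responder (not the actual
# Lenny implementation). The illusion of understanding comes entirely from
# pacing and from replies vague enough to fit almost any prompt.

import itertools
import time

CANNED_REPLIES = [
    "Yes, yes, sorry, could you say that again?",
    "Right, right... and what was that about the price?",
    "My eldest daughter usually handles these things for me.",
    "Hang on, there's someone at the door... sorry, go on.",
]

def scripted_responder(incoming_lines):
    """Yield one canned reply after each incoming message, cycling forever."""
    replies = itertools.cycle(CANNED_REPLIES)
    for _line in incoming_lines:
        time.sleep(1.5)  # a short pause reads as 'thinking', not automation
        yield next(replies)

if __name__ == "__main__":
    # Feed it a telemarketer-style pitch and watch the loop hold up.
    pitch = [
        "Hi, I'm calling about your car's extended warranty.",
        "It expires this month, so can I confirm your details?",
        "Sir, are you the registered owner of the vehicle?",
    ]
    for reply in scripted_responder(pitch):
        print(reply)
```

The point isn’t the code, which is trivial, but the asymmetry it exposes: the script never adapts, yet the caller does, because the situation has been engineered so their next move is easy to anticipate.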
In a time when social engineering is the malicious hacker’s method of choice, it’s to your benefit to experience that mental magic in a game rather than in real life. Subtext is fun, but it also serves as a great demonstration of how people can use technology to hack you more easily than they hack your machines. If you want to check it out, you can purchase the game on Itch for $7, which mostly covers its operational costs. The experience unfolds over about three days and takes two to three hours to complete. If you just want to get your feet wet, you can also try a meta-demo for free.
Now read:
- Mozilla Pulls Mr. Robot Add-On From Firefox Amid Blowback
- Serious Games Tackles ARG Design and the Coming Hive Mind
- A New Crop of Video Games Invade Real Life via Augmented Reality
from ExtremeTech https://ift.tt/2njg7P4