Debunking myths about computers and the internet
Things to know for when Skynet takes over

By Matt Perkins

Here’s a hard truth for anyone writing science fiction, fantasy, or any other form of speculative fiction: A big part of your audience knows a lot more about computers than you do. If you choose to feature computing or the Internet in your work, you run a real risk of straining credibility with these readers, who are, sadly, accustomed to disappointment from portrayals of technology in fiction. Such missteps pull readers out of the story and break their suspension of disbelief. With so many people now working in IT, these tech-savvy readers are too numerous to ignore.

Though reality isn’t always as exciting as the bad assumptions we’ve grown accustomed to, there are still plenty of opportunities for excitement and conflict in the digital world that you can use to your advantage. Your techie readers will certainly appreciate your attention to detail, and your respect for the reality of modern computing will make your story stand out in a crowd of tech ignorance.

Myth #1: The know-it-all computer geek

You probably wouldn’t expect Maytag to make a good car, nor would you expect Volkswagen to make a good dishwasher. And I bet you wouldn’t want a psychiatrist doing your liver transplant. The world of computing is as highly specialized as engineering or medicine, if not more so. This is why the trope of the computer geek character who effortlessly masters every piece of technology in existence is so irritating: It’s impossible for one person to be that knowledgeable. Computer tech as a whole is, quite literally, beyond the understanding of any one person.

Real IT pros tend to specialize in one small subset of computing, gaining deep knowledge and experience in their chosen discipline. When they encounter a problem outside their area of expertise, they consult with a specialist in that domain, much like a doctor would. Large companies employ dozens, if not hundreds, of diverse specialists, while small organizations typically outsource their most demanding IT tasks to firms that provide these services. Nobody hires an IT generalist—even if such a person existed, the “jack of all trades, master of none” principle would apply.

In any case, having one character who can solve every tech problem imaginable is lazy writing. That doesn’t mean your computer geek can’t try to solve a given problem—there’s nothing more enticing than a challenging puzzle—but you must put realistic limits on his abilities. There will be certain things he won’t be able to do, no matter how smart he is.

For your reference, some (definitely not all) of the most common areas of IT specialization are:

· SOFTWARE DEVELOPERS/ENGINEERS: create and update the software that runs on a computer.

· OS DEVELOPERS: a subspecialty of the above; work on the underlying operating system of the computer (e.g., Windows, Android).

· DATABASE ADMINISTRATORS: build and maintain databases, big and small; database security is typically part of their job.

· NETWORK ADMINISTRATORS: build and maintain computer networks and keep everything connected; they often handle network security as well.

· HARDWARE ENGINEERS: design, build, or maintain the hardware components of computers and computer accessories; as you can probably guess, this profession has a lot of subspecialties.

· TECHNICAL SUPPORT: the unsung heroes who pick up the phone when someone has a problem with their computer; good communication and plenty of patience are the hallmarks of this role; they fix basic issues on their own and triage the more complex problems to other IT people.

Myth #2: Quick and easy hacking

We’ve all seen this one: A hacker sits in front of a hostile computer (or even a keypad on a locked door), types a magical sequence of characters/digits, then smiles and says those two thrilling words: “I’m in.” The door opens, the files are downloaded, the missile launch is aborted, and the protagonists have succeeded again, all thanks to the mighty hacker and her arcane computer knowledge.

This is patently absurd, especially the door-and-keypad scenario. Please don’t ever write anything like this. It’s lazy and unrealistic, and few computer tropes will cause more eye rolls than this one.

Want to know the “magic” secret behind most real-world hacks? Someone was careless with his password, and a hacker got hold of it. The Sony Pictures hack of 2014 is a perfect example of this: A system administrator’s password was used to gain full access to Sony Pictures’ network. There’s no magic, no secret “open sesame” codes—just stolen network credentials. A hack like this is no more sophisticated than picking a pocket.

So, how do hackers obtain passwords? There are plenty of possibilities. The most common is basic negligence on the part of the account holder. At a previous job, my colleagues and I would always joke that if you flipped over ten of our clients’ keyboards, you’d find nine passwords. Worse, a lot of clueless users have passwords that are very easy to guess, like password or letmein (side note: for your own sake, please come up with a password that’s long and complex, but easy for you to remember). Then there’s phishing: a technique where a hacker impersonates a trusted party in order to steal login credentials; it is increasingly common and alarmingly successful. In some cases, a software vulnerability in weak or out-of-date software can expose passwords to someone who knows where to look; this is exactly how the Heartbleed bug in OpenSSL worked. And, of course, there’s the old-fashioned way: You can give the system administrator a briefcase full of cash or point a gun at his head. There’s a saying in information security: Your network is only as secure as your most vulnerable IT employee.
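
To make the “easy to guess” point concrete, here’s a minimal sketch of a dictionary attack in Python. Everything in it (the wordlist, the hash function, the “stolen” hash) is a simplified assumption for illustration, not a picture of any real system:

```python
# A toy dictionary attack: hash a list of common passwords and compare each
# one against a stolen password hash. The wordlist and the "stolen" hash are
# made up for this example.
import hashlib

COMMON_PASSWORDS = ["password", "letmein", "123456", "qwerty", "dragon"]

# Pretend this was lifted from a breached database (it's just sha256("letmein")
# computed here so the example is self-contained).
stolen_hash = hashlib.sha256(b"letmein").hexdigest()

for guess in COMMON_PASSWORDS:
    if hashlib.sha256(guess.encode()).hexdigest() == stolen_hash:
        print(f"Cracked on guess #{COMMON_PASSWORDS.index(guess) + 1}: {guess}")
        break
```

Real password databases add salting and deliberately slow hashing to make each guess more expensive, but no amount of slowing down saves a password that sits at the top of every wordlist.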

If your story needs a plausible hack, one possibility is to make it an inside job, which is alarmingly common in the real world. In my Winterwakers series, the hacker protagonist is a network support tech whose job grants him legitimate access to plenty of computer networks.

Myth #3: The one and only copy

“I have the last remaining copy of your mom’s secret banana pancake recipe,” cackled the villain. “Give me the nuclear launch codes, or I delete it forever!”

This one always makes me laugh. Data is everywhere, and it’s virtually immortal. This is true now more than ever, with real-time backups and cloud drives being the new normal. Companies and governments know their data is precious and treat it accordingly, employing frequent backups and redundant storage. Even private individuals today have access to strong data protection via home backup drives or with cloud storage services like Dropbox or iCloud. If something ever gets erased, either deliberately or accidentally, it’s a trivial matter to bring it back from the dead. Conversely, finding and erasing all traces of a file from all these backups and redundant systems is a complex, laborious task.
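
If you want a feel for how quickly copies multiply, here’s a tiny sketch in Python. The folder names and the “sync” step are invented stand-ins for real backup drives and cloud services:

```python
# A toy illustration of why "the only copy" rarely exists: one save fans out
# into several independent copies. The folder names below are invented
# stand-ins for a backup drive, a cloud sync folder, and a coworker's share.
import shutil
import tempfile
from pathlib import Path

workspace = Path(tempfile.mkdtemp())
original = workspace / "pancake_recipe.txt"
original.write_text("Mom's secret banana pancake recipe")

backup_locations = [workspace / name for name in ("backup_drive", "cloud_sync", "shared_folder")]
for location in backup_locations:
    location.mkdir()
    shutil.copy2(original, location / original.name)  # each "sync" makes another full copy

copies = list(workspace.rglob("pancake_recipe.txt"))
print(f"{len(copies)} copies exist; deleting one of them erases almost nothing")
```

Your villain would have to find and scrub every one of those locations, plus any offsite or offline copies, without anyone noticing.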

Even if no backup or cloud service has you covered, there are still ways to retrieve deleted files. Consumer apps exist that find deleted data and reconstitute it, often with perfect success. Data recovery pros have access to even more advanced tools and can sometimes recover files from fire- or flood-damaged computers.

If you want data to be lost or inaccessible to your characters, consider encryption instead of deletion. A properly encrypted file is nearly impossible to crack without the decryption key (the rules in “Quick and Easy Hacking” apply here as well).
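
As a rough illustration of how lopsided that fight is, here’s a minimal sketch using Python’s third-party cryptography package (an assumed dependency, installed with pip install cryptography). Whoever holds the key recovers the data instantly; without it, the ciphertext is just noise:

```python
# A minimal sketch of "unreadable without the key," using symmetric encryption
# from the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # whoever holds this controls the data
cipher = Fernet(key)

secret = b"Mom's secret banana pancake recipe"
ciphertext = cipher.encrypt(secret)  # safe to copy onto every backup you own

print(ciphertext[:40])               # gibberish without the key
print(cipher.decrypt(ciphertext))    # the original, recovered instantly
```

For plot purposes, “the key was destroyed” is a far more believable way to lose data forever than “every copy was deleted.”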

Myth #4: Enhance from nothing

This one has been used and abused countless times in police procedurals and science fiction thrillers alike. Using what appears to be simple computer software, a tech transforms a grainy, indistinct image into a crisp, sharp rendering of a license plate, an address, or a person’s face. Fictional techs can zoom in and enhance virtually any image the hard-nosed detective throws at them.

Reality is nowhere near this convenient. A computer image is stored as a grid of pixels: tiny colored squares that combine to create a picture. When you zoom in on an image, all the computer can do is make those pixels larger, which just looks like a blockier version of the same image. The software can’t add more pixels—any detail that wasn’t captured when the image was made simply isn’t there to recover.
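
If you’d like to see this for yourself, here’s a minimal sketch using the third-party Pillow imaging library (an assumed dependency, installed with pip install Pillow). It fakes a tiny 8×8 “security photo,” blows it up the way a naive zoom does, and shows that the result contains exactly the same information, just in bigger squares:

```python
# Enlarging an image only repeats the pixels it already has. Requires the
# third-party Pillow package (pip install Pillow); the tiny "security photo"
# here is random noise generated just for illustration.
import random
from PIL import Image

# Fake low-res capture: an 8x8 grid of random grayscale pixels.
lowres = Image.new("L", (8, 8))
lowres.putdata([random.randint(0, 255) for _ in range(8 * 8)])

# "Zoom in" by a factor of 32: every original pixel becomes a 32x32 block.
zoomed = lowres.resize((256, 256), Image.NEAREST)

# The enlarged image contains exactly the same pixel values as the original.
print(len(set(lowres.getdata())), "distinct pixel values before zoom")
print(len(set(zoomed.getdata())), "distinct pixel values after zoom")
```

Smarter resampling can smooth out the blocks, but it is still only guessing between the pixels it already has.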

That said, if you had a massive, hi-res image, it might in fact be possible to zoom in on it and get better detail, simply because of the sheer number of pixels available. Beware of relying on this possibility, though. A YouTube-quality (i.e., not that great) video weighs in at around 3 GB per hour of footage. Higher-resolution, uncompressed video files are hundreds of times that size. If your villain’s secret compound has dozens of security cameras (as all good secret compounds should), the data storage demands of hi-def video would be massively taxing on the hardware, if not outright cost-prohibitive. This is why security camera footage is so grainy: If it weren’t, you’d only be able to store a few minutes at a time.
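
To put rough numbers on that, here’s a back-of-the-envelope calculation using the chapter’s ~3 GB-per-hour figure; the camera count and retention period are invented assumptions:

```python
# Rough storage math for the security-camera scenario above. The 3 GB/hour
# figure comes from the text; the camera count and retention period are
# invented assumptions.
GB_PER_CAMERA_HOUR = 3     # "YouTube-quality" footage
cameras = 36               # dozens of cameras around the compound
days_retained = 30         # keep one month of footage on hand

total_gb = GB_PER_CAMERA_HOUR * cameras * 24 * days_retained
print(f"{total_gb:,} GB (~{total_gb / 1000:.0f} TB) for a single month")
# -> 77,760 GB (~78 TB), before you even consider higher resolutions
```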

Instead of zoom-and-enhance, real police departments use quantity over quality. Cameras are everywhere nowadays, and most positive IDs are made by examining and comparing images from multiple cameras, usually without the aid of software. Learn from the real world and don’t lazily rely on this well-worn trope: Make your characters work hard to crack that case. Computers can do many amazing things, but magically enhancing an image to move your plot along isn’t one of them.