Friday, July 15, 2011

Does Not Compute: 10 PC Myths from Movies and Television

A computer will blow up if it cannot answer a question.

According to Hollywood, computers are so delicate that, when faced with a question they cannot answer, they will explode. Nobody in the history of film knew that better than William Shatner. During his run as Captain James T. Kirk, the Shat dispatched more malicious computers, androids, and evil AIs with strings of contradictory orders, illogical paradoxes, and questions about love and the human condition than you can shake a bat'leth at.

If computers were really that volatile, you wouldn't be able to count the number of people who were sent to an early grave after stumping the disc-bound and Web-enabled iterations of Microsoft Encarta back in the day. The same goes for Wolfram Alpha: we don't recall any mention of the dangers of posing a difficult question to its servers. In reality, computers don't explode when they can't answer your question or solve your problem. The worst that can happen is that your system freezes, or ponies up a Blue Screen of Death. Admittedly, in the latter case many users might prefer to see an explosion, but sadly, it just doesn't happen.

Voice recognition software works every time, without error.

While speech recognition software has improved by leaps and bounds over the last decade, it still kind of sucks. Due to the many nuances of human language, such as diverse dialects, tone, and in some cases speech disorders, many people are unable to dictate an e-mail in Outlook, let alone verbally control a computer, with anything like precision and reliability.

Except, of course, in Hollywood. In 2001: A Space Odyssey, HAL refuses Dave's bidding to open the pod bay doors; Will Smith carries on a meaningful conversation with VIKI in I, Robot; and in Blade Runner, Deckard is able to direct his home computer to manipulate a crime scene photo with nothing more than a few spoken words. Riddle us this: when was the last time you muttered orders into a microphone to get GIMP or Photoshop to touch up your holiday photos? Exactly. While modern supercomputers like IBM's Watson have the power to process voice commands with uncanny accuracy, consumer hardware of the kind you're using to read this just isn't up to the task, not in the way Hollywood would have us believe it is. It's too bad; we bet there would be no shortage of takers.

Any image or video can be corrected, blown up, and made crystal clear.

Speaking of Deckard futzing with photographs in Blade Runner, why is Hollywood so obsessed with unrealistic depictions of image manipulation? No matter how grainy a photo may be, how dark it was outside when the picture was taken, or how far the photographer was from the subject, any image can supposedly be zoomed, enhanced, and dolled up enough to be used in court or to track down an evildoer on the run.

Jim True-Frost's Roland Pryzbylewski does it with video in The Wire, and Bryan Brown got up to similar tricks back in 1986's F/X. CSI? We haven't even begun. The truth of the matter is that no matter how advanced the software, or how beefy a rig you're running, how well an image can be read, and how much you can blow it up for viewing without your eyes bleeding, depends heavily on the quality of the original image you're working with. In other words, if a shot was taken with a Cyber-Shot D710, no amount of zoom-and-enhance is going to make it look like it was taken with a Sony a900.
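For the skeptical: here's a toy Python sketch (not any real tool's algorithm) of why "enhance" can't work. We blow up a hypothetical 2x2 grayscale image with nearest-neighbour interpolation; every pixel in the enlarged image is a copy of one of the original four, so zooming adds size, never detail.

```python
# Toy demonstration: upscaling cannot recover detail that was never
# captured. We "enhance" a 2x2 grayscale image to 8x8; every new
# pixel is just a copy of an existing one.

def upscale_nearest(image, factor):
    """Enlarge a 2-D list of pixel values by an integer factor."""
    return [
        [image[y // factor][x // factor]
         for x in range(len(image[0]) * factor)]
        for y in range(len(image) * factor)
    ]

tiny = [[10, 200],
        [60, 120]]              # a 2x2 "photo"

big = upscale_nearest(tiny, 4)  # the Hollywood "enhance" step

print(len(big), len(big[0]))                    # 8 8
# The enlarged image still contains only the original four values:
print(sorted({p for row in big for p in row}))  # [10, 60, 120, 200]
```

Fancier resampling filters (bilinear, bicubic) blend neighbouring values instead of copying them, which looks smoother, but they're still only guessing from the pixels that were actually captured.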


Everybody uses a Mac.

If Hollywood had its way, it would have us believe that, with the exception of a few terrorists and code monkeys, the majority of Earth's population are computer users wielding MacBooks, iMacs, and MacBook Pros. The fact that you're reading this here, and not lazing around with the lovable freaks over at Mac|Life, goes a long way toward proving that this myth is nothing but a bunch of bunk. You know what else goes a long way toward illustrating our point? Numbers: according to market research firm IDC, Apple held only 10.7% of the personal computer market in North America during its second fiscal quarter of 2011, behind HP and Dell. That's far removed from what's shown on both the big and small screens.

Of the few people who aren't using a Mac, nobody uses Windows or Microsoft Office. Instead, they use some custom GUI with a 72-point font.

If Hollywood is to be believed, the remaining 10% of computer users, consisting of terrorists, dodgy internet cafés, and backwater police departments stuck with anything other than Apple hardware, all have severe visual impairments that force them to use ridiculously large fonts at all times. While it's probably one of the best programs in the history of television, The Wire is bad about this: over six seasons, Lester sits two feet away from his computer monitors, yet insists on blasting his eyeballs from their sockets with an absolutely massive font. Moreover, if a film doesn't feature Apple hardware, the characters don't even get a PC running Microsoft software; instead, they all seem to prefer some custom-built, but thoroughly unintuitive, graphical user interface.

While it'd be easy to say that film and television producers have a hate on for the most popular operating system on the planet, there's a less extreme answer to be had here: While most of us can type up an email using a 14 point font without any discomfort, small font sizes are wicked hard to read on the big screen and television, and could leave viewers missing an important visual cue that was meant to drive the show's plot forward.

Do you have a myth that we missed? Add to our list in the comments!
