Remote meetings

Teleconference room, five minutes

Word processors! Digital calendars! An “electronic mail system whereby messages are sent and received by computer”! A 1982 New York Times article on the “office of the future” covered all these exciting possibilities, plus one more: the remote meeting.

“Teleconferencing,” as it was described, would allow people in far-flung locations to see and speak to one another. Even in its nascent state, the prospect was met with some ambivalence.

“So far, teleconferencing has been used mainly as a substitute for routine weekly meetings among people who have already met face to face,” the article read. “But according to Robert Johansen, a senior research fellow at the Institute for the Future in Menlo Park, Calif., ‘the main market for teleconferencing is for meetings that don’t occur now among people who are geographically separated. It creates new communications opportunities.’”

After a surprisingly long wait, that future is here. Remote meeting software has fostered professional collaborations that could never have existed in a previous era and allowed partnerships around the globe to thrive. But remote meetings are also glitchy, time-consuming, and frequently infuriating. Let’s dial in.

BRIEF HISTORY

1878: George du Maurier publishes a cartoon in Punch magazine depicting the “telephonoscope,” a fictional Edison invention that transmits images along with sound.

1964: Bell Labs debuts the Picturephone, which requires that both parties make advance reservations, travel to one of the nation’s few designated Picturephone booths, and remain motionless for the duration of the call.

1993: The first “webcam” makes its public debut; it’s used to monitor a coffee pot at the University of Cambridge.

1994: The QuickCam is the first consumer webcam; it costs $100 ($175 today).

1995: The first public videoconference between North America and Africa takes place at an event called the Cybersafari Digital Be-In and Internet Love-Fest, connecting a San Francisco tech conference and a Cape Town “tech rave” for a one-hour video chat.

1996: College student Jennifer Ringley debuts Jennicam and becomes the first livestream internet star.

2006: Skype (short for “sky peer to peer”) adds videoconferencing.

2010: FaceTime emerges from an Apple gaming social network.

2013: British adventurer Daniel Hughes speaks to the BBC via Skype from the top of Mount Everest.

In 1936 the first videoconference was transmitted at a robust 25 frames per second—better than the first consumer webcam 60 years later—between Berlin and Leipzig, with the quality of “a small size projection of a substandard cinema film.” A three-minute call cost about a week’s wages. Over the next four years, Hamburg and Munich were added to the system. Each outpost handled about 20 calls a day, but World War II shut it down.

Progress then ground to a halt. In the year 2000, when the future was supposed to have long since arrived, a historian told The New York Times that videoconferencing had been “on the verge of happening for nearly a hundred years.” AT&T started work on video phones in the 1930s, but didn’t try to bring one into homes and businesses until 1970, with a device that was, the company’s historian told the Times, “the most famous failure in the history of the Bell system.”

Execs told AT&T they’d embrace the Picturephone at $50 a month ($340 today). It rolled out at $70 a month in Chicago ($475 today) and $160 in Pittsburgh (over $1,000), plus overages after half an hour. It was a dud, despite a forward-thinking feature that allowed users to look at documents in a company database. AT&T tried again in 1992 with the Videophone 2500, but it cost $1,600 to see your loved ones on a 3.3-inch screen. AT&T wasn’t alone: MCI made a $750 video phone that made callers look “as if they are slowly moving underwater.”

AT&T concluded in 2001 that the Picturephone was “too big, too expensive, and uncomfortably intrusive,” and also that “people did not want to be seen on the telephone.” Eventually, tiny and cheap cameras would become ubiquitous, which in turn would make them seem less intrusive, aiding their widespread adoption as a business tool.