The BBC's Szu Ping Chan takes a look at the futuristic technology depicted in 1982's Blade Runner, which was set in November 2019. It now being November 2019, how did it do? We're doing great on telecommunications and despoiling the planet, but not so well on the genetically engineered, vat-grown human clones front.
Computational photography is becoming the norm, helping our phones take incredible low-light pictures and automatically blur the background of our portrait shots. But the Esper machine, which Deckard uses to find clues by zooming in on different things within photos, remains ahead of its time. It enables him to see objects and people from different angles, and to reveal items that were not previously visible. AI researchers are working on software that can create interactive 3D views from a single 2D source image, but it will likely be many more years before Photoshop gets the feature.
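The background-blur half of that is conceptually simple once you have a per-pixel depth estimate. Here's a rough sketch of the idea in Python; the depth map, the portrait_blur helper, and its parameters are all invented for illustration, and real phone pipelines lean on learned depth estimation, matting, and multi-frame fusion rather than a single Gaussian blur.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def portrait_blur(image, depth, focus_depth, tolerance=0.1, blur_sigma=8.0):
    """image: H x W x 3 floats in [0, 1]; depth: H x W floats in [0, 1]."""
    # Pixels whose depth is close to the focus depth count as foreground.
    foreground = np.abs(depth - focus_depth) < tolerance
    # Blur the whole frame spatially (not across colour channels), then
    # composite the sharp foreground back over it.
    blurred = gaussian_filter(image, sigma=(blur_sigma, blur_sigma, 0))
    mask = foreground[..., None].astype(image.dtype)
    return mask * image + (1.0 - mask) * blurred

# Toy usage: a random "photo" whose left edge is nearest the camera.
rng = np.random.default_rng(0)
img = rng.random((480, 640, 3))
depth = np.tile(np.linspace(0.0, 1.0, 640), (480, 1))
result = portrait_blur(img, depth, focus_depth=0.2)
```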
How might Deckard's camera work, practically? The data could only represent what the camera can see at the moment of capture. Recent light-field cameras (which record the direction of incoming light, not just its intensity) can do some of the Blade Runner trick, but not nearly enough to offer the shift of perspective Deckard was able to explore on his single-purpose photo printer.
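To make that limitation concrete: a light-field capture can be thought of as a small grid of sub-aperture views, and synthesizing a "new" viewpoint just means interpolating within that grid, so you can never step outside the physical aperture. Here's a hypothetical sketch in Python; the array layout and the shift_view helper are assumptions for illustration, not any particular camera's software.

```python
import numpy as np

# Treat a light field as a 4D array L[u, v, y, x]: a small U x V grid of
# sub-aperture images, each H x W pixels. (Hypothetical layout.)
def shift_view(lightfield, u, v):
    """Synthesize a viewpoint at fractional aperture coordinates (u, v) by
    bilinearly interpolating the four nearest sub-aperture images. The
    catch: (u, v) must stay inside the grid, so the maximum perspective
    shift is bounded by the physical size of the capturing aperture."""
    U, V = lightfield.shape[:2]
    u = float(np.clip(u, 0, U - 1))
    v = float(np.clip(v, 0, V - 1))
    u0, v0 = int(np.floor(u)), int(np.floor(v))
    u1, v1 = min(u0 + 1, U - 1), min(v0 + 1, V - 1)
    du, dv = u - u0, v - v0
    top = (1 - dv) * lightfield[u0, v0] + dv * lightfield[u0, v1]
    bottom = (1 - dv) * lightfield[u1, v0] + dv * lightfield[u1, v1]
    return (1 - du) * top + du * bottom

# Toy usage: a 9 x 9 grid of 256 x 256 grayscale sub-aperture views.
L = np.random.rand(9, 9, 256, 256)
view = shift_view(L, u=3.4, v=5.7)  # a viewpoint between the captured ones
```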
Perhaps his camera spits out little drones with supplemental cameras, snapping simultaneously from nearby points of view and baking all the data into the original. Or perhaps, being a Blade Runner, he has access to encrypted information in the print captured from nearby surveillance cameras.
Or maybe it's pointless to speculate about how a photo could have gigapixel resolution and the light field of a foot-wide capture surface, yet still look exactly like a Polaroid and require a machine with the size and grace of a photocopier to examine.