There are some events I will always remember because they were both significant and industry-changing: the Windows 95 launch, the initial iPod launch event, my first use of Microsoft HoloLens, the launch of AMD’s Threadripper (I just love that name), and this week the launch of the NVIDIA RTX platform.
In some ways this RTX workstation graphics launch could eventually eclipse the others, because it sets a foundation for changing reality and dramatically altering our perception of the world around us.
Whether we are talking CGI or just Photoshop, the ability to quickly and inexpensively create real-looking images based mostly on imagination has long been a Holy Grail of imaging. Animators and graphic artists now massively outnumber actors in many major movies, and it is often amazing to watch the credits and see the huge number of folks focused on reconstructing reality to make the movie work.
One enormous challenge in this effort has been light. The way light bounces off objects, creates shadows and alters the appearance of an object is what often tells our brain that the object is real and not artificial. Handling the complexity of how light reflects and refracts has been a bit of a nightmare for those trying to create these realistic images.
Way back in 1979, a guy by the name of Turner Whitted created a concept called multi-bounce recursive ray tracing. The problem was that it was massively resource-intensive. He was able to create a low-resolution 512×512 image using a $1.4M midrange computer and 1.2 hours of computer time. Yes, it proved the concept, but the picture’s quality and cost weren’t acceptable at any reasonable production scale. It was believed that it would take one Cray supercomputer for every pixel in an image to provide a truly realistic real-time image.
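To make the multi-bounce idea concrete, here is a minimal Whitted-style sketch in Python: a single reflective sphere, a pinhole camera and a few recursive bounces. The scene and shading model are my own simplification for illustration, not Whitted’s original code, but the recursive trace() call is the multi-bounce concept in miniature.

```python
# A minimal sketch of Whitted-style recursive ray tracing: one reflective sphere
# under a simple sky. This is an illustrative toy, not Whitted's original method.
import numpy as np

WIDTH, HEIGHT, MAX_BOUNCES = 160, 120, 3
SPHERE_CENTER, SPHERE_RADIUS = np.array([0.0, 0.0, 3.0]), 1.0
LIGHT_DIR = np.array([1.0, 1.0, -1.0]) / np.sqrt(3.0)

def hit_sphere(origin, direction):
    """Return the distance to the sphere along the ray, or None if it misses."""
    oc = origin - SPHERE_CENTER
    b = 2.0 * np.dot(oc, direction)
    c = np.dot(oc, oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def trace(origin, direction, depth):
    """Recursively trace a ray, bouncing off the sphere up to MAX_BOUNCES times."""
    t = hit_sphere(origin, direction)
    if t is None or depth >= MAX_BOUNCES:
        # Simple sky gradient as the background when nothing is hit.
        return np.array([0.5, 0.7, 1.0]) * max(direction[1] + 0.5, 0.0)
    point = origin + t * direction
    normal = (point - SPHERE_CENTER) / SPHERE_RADIUS
    diffuse = max(np.dot(normal, LIGHT_DIR), 0.0)
    # Mirror reflection: this recursive bounce is the heart of Whitted's idea.
    reflected = direction - 2.0 * np.dot(direction, normal) * normal
    bounce = trace(point, reflected, depth + 1)
    return 0.6 * diffuse * np.array([1.0, 0.3, 0.3]) + 0.4 * bounce

image = np.zeros((HEIGHT, WIDTH, 3))
for y in range(HEIGHT):
    for x in range(WIDTH):
        # Shoot one primary ray per pixel from a pinhole camera at the origin.
        d = np.array([(x - WIDTH / 2) / WIDTH, (HEIGHT / 2 - y) / HEIGHT, 1.0])
        image[y, x] = np.clip(trace(np.zeros(3), d / np.linalg.norm(d), 0), 0, 1)
# "image" now holds the rendered RGB values in [0, 1].
```

Even at this toy scale you can see why the approach was so expensive in 1979: every pixel spawns rays, and every reflective surface multiplies the work.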
But technology changed, and a few months ago NVIDIA demonstrated that it could create photo-realistic images in a reasonable amount of time using its new DGX Station (which costs around $80K). This week NVIDIA announced its RTX video card line, costing between $2,300 and $10,000 (depending on version), which can do the job far more quickly.
Many of us thought this level of performance was at least a decade out. This launch is potentially extremely disruptive for the workstation industry focused on rendering.
With Intel architectures, annual performance improvement has generally been under 10 percent for the last decade or so. Users don’t really notice performance improvements under 20 percent, so workstations, which in a rendering environment often connect directly to the firm’s bottom line, typically get updated on a two- to three-year cadence. These aren’t inexpensive products, but they have such a huge impact on productivity that once you get past a 20 percent improvement you can justify the replacement.
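A quick back-of-the-envelope check, assuming the rough 10 percent annual figure above, shows why that cadence lands at two to three years:

```python
# Back-of-the-envelope check of the upgrade-cadence argument: at roughly
# 10 percent improvement per year (the article's rough estimate), how many
# years does it take to cross the 20 percent threshold users actually notice?
annual_gain = 0.10
cumulative = 1.0
years = 0
while cumulative < 1.20:
    cumulative *= 1.0 + annual_gain
    years += 1
print(f"{years} years -> {cumulative - 1:.0%} cumulative improvement")
# Prints "2 years -> 21% cumulative improvement", which is why a
# two- to three-year replacement cadence makes sense.
```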
In the case of RTX, we are talking a 6x performance improvement (according to NVIDIA) over the prior generation of NVIDIA professional graphics cards. This is an unprecedented performance improvement, and I’m wondering what one of these cards would do on top of AMD’s Threadripper CPU platform for maximum impact.
One of the fascinating, and I think underplayed, parts of this new card line is an artificial intelligence (AI) component that can be trained to up-convert images. The card first renders in low resolution, and then the AI takes over and converts the image in real time to 4K or 8K. This conversion capability can be applied to almost any low-resolution image: the AI learns how to interpolate the needed extra pixels and then reimages the picture or frame to create a far higher-resolution result. Interestingly, it can do the same thing with frames in a movie, taking a regular-speed, GoPro-like video file and converting it into the kind of high-speed video that would typically require a $140K high-speed camera.
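For the curious, here is a rough sketch of how learned up-conversion works in general: upscale a low-resolution frame, then let a small trained network fill in the missing detail. This is an SRCNN-style toy in PyTorch, my own illustration rather than NVIDIA’s actual DLSS code; the network layout and frame sizes are made up for the example.

```python
# A minimal sketch of learned super-resolution (SRCNN-style), illustrating the
# general idea behind AI up-conversion; this is NOT NVIDIA's implementation.
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    """Upscale a low-resolution frame, then let conv layers learn to sharpen it."""
    def __init__(self, scale=4):
        super().__init__()
        self.upsample = nn.Upsample(scale_factor=scale, mode="bicubic",
                                    align_corners=False)
        self.refine = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=5, padding=2),  # predict the high-res RGB frame
        )

    def forward(self, low_res):
        return self.refine(self.upsample(low_res))

# Example: up-convert a 480x270 frame by 4x, toward full HD (1920x1080).
model = TinyUpscaler(scale=4)
frame = torch.rand(1, 3, 270, 480)   # batch of one low-resolution RGB frame
high_res = model(frame)              # shape: (1, 3, 1080, 1920)
print(high_res.shape)
```

In a real system the refinement network is trained on pairs of low- and high-resolution frames, which is what lets it "invent" plausible detail rather than just blurring the pixels apart.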
This all could be applied to old sports footage that was taken before high-speed cameras were available, or to old TV shows and movies to bring them up to current standards and turn long-languishing video libraries into viable content for movie services like Netflix and Amazon Prime. This alone is a multi-billion-dollar opportunity.
Another thing I expect to become affordable is the ability to put your own family members into certain movies. You could give your son a Robin Hood movie with his face on Robin Hood, for instance, and it would be photorealistic. Or your daughter could be the face of Tomb Raider, and she could be cut into scenes in video games. Speaking of video games, old titles could be far more easily updated for current graphics. Granted, gameplay would be unchanged, but the screen image would be vastly more pleasing on current TVs and monitors.
This powerful ability to create and modify images in photo-realistic ways, relatively inexpensively, will, I believe, fundamentally change the TV and movie industry. We should see increased interest in old shows and movies as they are updated to new digital standards, as well as movies created from scratch that are both less expensive to make and more realistic to watch. Of course, it won’t fix issues with the scripts and editing (two areas that could also use some AI help), but the quality and amount of video content should increase substantially as a result.
For those focused on editing or creating digital visual content, the RTX line from NVIDIA is a game changer. And it is really only the tip of the iceberg, as this is the first generation. Makes you wonder what generation number three will be like, doesn’t it?