While the revision of past events and the fictionalization of historical figures can be entertaining fodder for movie theater audiences, the manipulation of video or audio to fake statements or entire events can have adverse and potentially serious consequences. These threats, and the efforts by DARPA and others to detect and avert them, have drawn significant attention from the news media, including a recent CNN feature highlighting performers in the DARPA “MediFor” program. Short for “Media Forensics,” the program supports several research teams focused on detecting still image and video manipulations, as well as audiovisual inconsistencies, and on developing tools that can determine how alterations were made. The technology used to alter videos is relatively affordable, easily accessible, and improving in quality, which makes detecting deepfake media an ever-increasing challenge. It is the goal of programs such as MediFor to develop algorithms that can more readily identify deepfake media, and their success will be crucial if the public is to continue trusting what they hear on the radio and see on their screens in the years to come.
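To give a flavor of the kind of check such forensic tools perform, the sketch below implements copy-move forgery detection by exact block matching, a classic textbook technique in which one region of an image pasted elsewhere in the same image is exposed by identical pixel blocks appearing at two distant locations. This is a generic illustration, not an algorithm from MediFor; the function names, parameters, and the synthetic test image are all invented for the example.

```python
import random
from collections import defaultdict

def find_duplicate_blocks(img, block=4, min_dist=8):
    """Return pairs of top-left coordinates whose block x block pixel
    regions are identical and at least min_dist apart (Manhattan distance).
    img is a 2D list of grayscale values. Exact matching is a toy stand-in
    for the robust feature matching that real forensic tools use."""
    h, w = len(img), len(img[0])
    seen = defaultdict(list)
    for y in range(h - block + 1):
        for x in range(w - block + 1):
            # Flatten each block into a hashable key and record its position.
            key = tuple(img[y + dy][x + dx]
                        for dy in range(block) for dx in range(block))
            seen[key].append((y, x))
    pairs = []
    for locs in seen.values():
        for i in range(len(locs)):
            for j in range(i + 1, len(locs)):
                (y1, x1), (y2, x2) = locs[i], locs[j]
                # Ignore near-neighbor matches from locally uniform texture.
                if abs(y1 - y2) + abs(x1 - x2) >= min_dist:
                    pairs.append((locs[i], locs[j]))
    return pairs

# Synthetic 32x32 "photo" of seeded random noise, plus a forged copy
# in which a 6x6 patch is cloned from (2, 2) to (20, 20).
rng = random.Random(42)
clean = [[rng.randrange(256) for _ in range(32)] for _ in range(32)]
forged = [row[:] for row in clean]
for dy in range(6):
    for dx in range(6):
        forged[20 + dy][20 + dx] = clean[2 + dy][2 + dx]

clean_hits = find_duplicate_blocks(clean)    # noise has no repeated blocks
forged_hits = find_duplicate_blocks(forged)  # the cloned patch is flagged
```

Real detectors replace exact matching with features that survive recompression, rescaling, and added noise, but the underlying idea is the same: finding regions of a media file that agree too well to be coincidence.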