The result of a fruitful collaboration between ISPL and Anderson Rocha’s group at Universidade Estadual de Campinas is now IEEE Access “Article of the Week”. If interested in open-set camera model identification, go check it out!  P. R. M. Júnior, L. Bondi, P. Bestagini,
ISPL joined DARPA’s Semantic Forensics (SemaFor) research program as part of the Purdue University team. We will develop techniques to detect and attribute falsified media such as text, audio, images, and video.
DeepFake techniques enable realistic AI-generated videos of people doing and saying fictional things. These videos can be used maliciously as a source of misinformation, harassment, and persuasion, thus negatively impacting society. ISPL is actively working to fight this threat.
ISPL is involved in the PREMIER (PREserving Media trustworthiness in the artificial Intelligence ERa) project, funded by the Italian Ministry of Education, University, and Research. The objective of PREMIER is to devise new techniques capable of distinguishing fake videos from original ones.
Paolo Bestagini participated in the ‘Deepfakes – Prepare Now’ debate in Brazil, organized by Witness, about the implications of video manipulation using artificial intelligence. More info here.
David Güera (Purdue University) presents the video manipulation detection method developed in partnership with ISPL. This work was accepted as a contribution to the workshop “Deep Learning for Detecting AudioVisual Fakes” at ICML 2019.  D. Güera, S. Baireddy, P.
The IEEE International Workshop on Information Forensics and Security (WIFS) is a unique annual event organized by the IEEE Information Forensics and Security (IFS) Technical Committee of the IEEE Signal Processing Society. It is a major forum that brings together researchers
Paolo Bestagini gave a talk on video forensics at the Institute of Computing (IC) of the State University of Campinas (UNICAMP), invited by Prof. Anderson Rocha. Abstract: The recent development of multimedia devices and editing tools, together with the proliferation of video