Following recent initiatives for the democratization of AI, generative models have become increasingly popular and accessible. The widespread use of generative adversarial networks (GANs) is benefiting industries such as entertainment; however, these models are also used with malicious intent. Believing a fake video of a politician, distributing fake pornographic content of celebrities, and fabricating impersonated videos as evidence in court are just a few real-world consequences of deep fakes. This lack of authenticity and increasing information obfuscation pose real threats to individuals, the criminal justice system, and information integrity. As every technology tends to be built alongside a counter-technology that neutralizes its negative effects, we believe that it is the perfect time to develop a deep fake detector. Deep fakes depend on photorealism to disable our natural detectors: we cannot simply look at a video to decide whether it is real. On the other hand, this realism is not yet preserved in the physiological signals of deep fakes. We present novel approaches to detect synthetic content in portrait videos as a preventive solution to the emerging threat of deep fakes. We observe that detectors blindly utilizing deep learning are not effective in catching fake content, as generative models produce formidably realistic results. Our key assertion is that biological signals hidden in portrait videos can be used as an implicit descriptor of authenticity, because they are neither spatially nor temporally preserved in fake content. Using both traditional machine learning approaches and deep learning methods, we exhaustively analyze the signals extracted from the heartbeats, PPG signals, eye vergence, and gaze movement of deep fake actors to create robust and accurate deep fake detectors. Moreover, we trace the source of a deep fake by exploiting its heartbeat, via the residuals of different generative models.
Achieving leading results on existing datasets and on our in-the-wild dataset justifies our observations and pioneers a new dimension in deep fake research.
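To illustrate the kind of biological signal the abstract refers to, the sketch below extracts a raw remote-PPG (rPPG) trace as the mean green-channel intensity over a face region, then reads off its dominant frequency. This is a minimal illustration, not the paper's method: the region of interest, the green-channel heuristic, and the synthetic demo frames are all assumptions, and practical rPPG pipelines are considerably more involved.

```python
import numpy as np

def extract_rppg(frames, roi):
    """Raw rPPG trace: mean green-channel intensity over a face ROI per frame.

    frames: iterable of HxWx3 RGB arrays; roi: (y0, y1, x0, x1).
    """
    y0, y1, x0, x1 = roi
    return np.array([f[y0:y1, x0:x1, 1].mean() for f in frames])

def dominant_frequency(signal, fps):
    """Dominant frequency (Hz) of the detrended signal via a real FFT.

    For a genuine face this should land in the heart-rate band (~0.7-4 Hz);
    for synthetic content the spectrum tends to lack such a clean peak.
    """
    s = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(s))
    freqs = np.fft.rfftfreq(len(s), d=1.0 / fps)
    return freqs[spectrum.argmax()]

# Synthetic demo: a 1.2 Hz (72 bpm) pulse riding on the green channel.
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
frames = [np.full((64, 64, 3), 128.0) + 2.0 * np.sin(2 * np.pi * 1.2 * ti)
          for ti in t]
sig = extract_rppg(frames, (16, 48, 16, 48))
print(round(dominant_frequency(sig, fps), 1))  # ~1.2 Hz
```

The demo recovers the 1.2 Hz pulse because ten seconds at 30 fps gives a 0.1 Hz frequency resolution, so the heartbeat falls exactly on an FFT bin.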