Free, easy-to-use machine learning face-swapping software for videos is here.
A good article on BBC News shows the technology and how it is primarily being used. While there are plenty of joke videos with Nicolas Cage replacing famous actors, and various political face swaps, the main use so far has been superimposing Hollywood actresses' faces onto pornographic videos. These so-called "deepfakes" are blowing up in popularity right now.
Ethically, I view fake porn videos as a crass violation of the actresses whose likenesses are being used, and I doubt there will be much debate on that. But I think this technology has far wider ramifications for the video surveillance industry. Will the ability of software to easily manipulate video spark a technology arms race as we try to watermark video or otherwise thwart tampering? Will it influence future jurors viewing video evidence in a trial?
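To make the watermarking/authentication question concrete, here is a minimal sketch of one possible countermeasure: fingerprinting each frame with a cryptographic hash at capture time, so later tampering (such as a swapped face) can be detected. This is an illustrative assumption, not a description of any shipping product; the "frames" below are stand-in byte strings, where a real system would hash (and sign) raw frame buffers from the camera.

```python
# Minimal sketch of frame-level content authentication for video.
# Assumption: the recorder stores a SHA-256 digest per frame at capture
# time; a verifier later recomputes digests and flags mismatches.
import hashlib

def fingerprint_frames(frames):
    """Return a SHA-256 digest for each frame, computed at record time."""
    return [hashlib.sha256(f).hexdigest() for f in frames]

def verify_frames(frames, recorded_digests):
    """Return the indices of frames whose content no longer matches."""
    return [i for i, (f, d) in enumerate(zip(frames, recorded_digests))
            if hashlib.sha256(f).hexdigest() != d]

original = [b"frame-0", b"frame-1", b"frame-2"]
digests = fingerprint_frames(original)

# A deepfake-style edit alters frame 1; verification catches it.
tampered = [b"frame-0", b"frame-1-swapped", b"frame-2"]
print(verify_frames(tampered, digests))  # -> [1]
```

Of course, bare hashes only prove a frame changed, not who recorded it; a real deployment would also sign the digest list so an attacker cannot simply re-hash the edited video.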
Also, if you want to see how the software works for yourself, it is easy to find by searching for "FakeApp" on Google. Be warned that the download page and instructions are on a Reddit.com forum dedicated to "deepfake" video creation, which contains NSFW images.