Computer Scientists Develop a New Tool That Generates Videos From Thematic Text

In a world where inexperienced photographers and videographers capture a flood of content on their smartphones and handheld devices, there is a need for a smart, easy-to-use tool to automate the creation of movies and video montages. To date, many high-quality videos still rely on professional frame-based editing tools to process raw material and produce a coherent video with a captivating storyline.

A global team of computer scientists from Tsinghua and Beihang Universities in China, Harvard University in the US, and IDC Herzliya in Israel has developed "Write-A-Video", a new tool that generates videos from thematic text. Through simple word and text editing, the tool automatically determines which scenes or shots from a video repository are selected to illustrate the desired story. The tool allows inexperienced users to create high-quality video montages in a simple and easy way, without the need for professional video production and editing skills.

The team will present its work at ACM SIGGRAPH Asia, held from November 17th to 20th in Brisbane, Australia. Now in its twelfth year, SIGGRAPH Asia attracts the most respected technical and creative people from around the world working in computer graphics, animation, interactivity, games, and emerging technologies.

While existing video editing tools require professional editing skills, newcomers can use the new method to create videos that tell stories in a more natural way. Write-A-Video, the researchers say, allows users to create a video montage simply by editing the text associated with the video. For example, adding or deleting text and moving sentences around are converted into video editing operations, such as finding appropriate shots, cutting and rearranging them, and composing the final video montage.
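To make the idea concrete, below is a minimal sketch (not the authors' implementation) of how edits to a short text could be re-derived into a sequence of shots. The `Shot` record, the keyword-overlap matching, and the `apply_text` helper are illustrative assumptions, standing in for the tool's far richer video understanding.

```python
from dataclasses import dataclass

@dataclass
class Shot:
    clip_id: str
    keywords: set    # tags describing what the shot depicts
    duration: float  # length in seconds

# A toy repository of tagged shots standing in for a real video library.
REPOSITORY = [
    Shot("beach_01", {"sea", "waves", "sunset"}, 4.0),
    Shot("city_07", {"street", "traffic", "night"}, 3.5),
    Shot("forest_02", {"trees", "hiking", "morning"}, 5.0),
]

def shots_for_sentence(sentence):
    """Pick shots whose keyword tags overlap the words of a sentence."""
    words = set(sentence.lower().replace(".", "").split())
    return [s for s in REPOSITORY if s.keywords & words]

def apply_text(sentences):
    """Editing the text (adding, deleting, or reordering sentences)
    simply re-derives the shot sequence."""
    montage = []
    for sentence in sentences:
        montage.extend(shots_for_sentence(sentence))
    return montage

draft = ["Waves roll in at sunset.", "Hikers walk among the trees."]
print([s.clip_id for s in apply_text(draft)])        # ['beach_01', 'forest_02']
print([s.clip_id for s in apply_text(draft[::-1])])  # reordering sentences reorders the montage
```

In this toy version, moving a sentence simply moves its matching shots; the actual tool also has to decide where to cut within each shot and how the pieces fit together.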

"Write-A-Video takes advantage of the advances in automatic video understanding and a unique user interface to enable a more natural and easier video production," said Professor Ariel Shamir, Dean of the Efi Arazi School of Computer Science at IDC Herzliya. "With our tool, the user provides input primarily in the form of text editing, and the tool automatically searches for semantically-appropriate candidate recordings from a video repository, and then uses an optimization method to assemble the video montage by automatically cropping and rearranging the recordings. "

"With Write-A-Video, users can also explore visual styles for each scene using cinematic phrases that produce, for example, faster or slower movies, fewer or more content moves, and so on." explains Dr. Miao Wang from Beihang University. Computer Scientists

When choosing candidate shots from the video repository, the method also considers their aesthetic appeal, selecting those that are well lit, properly focused, and free of blur and camera shake. "The user can render the movie at any time to preview the video montage result with an accompanying voice-over narration," says Professor Shi-Min Hu from Tsinghua University.
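A minimal sketch of filtering candidates by simple aesthetic heuristics of the kind named above (exposure, sharpness, stability) might look like the following; the per-frame measurements, weights, and threshold are assumptions for illustration rather than the paper's actual criteria.

```python
import statistics

def aesthetic_score(frames):
    """Score a shot from per-frame measurements: 'brightness', 'sharpness',
    and 'shake', each normalized to 0..1. The weights are illustrative."""
    brightness = statistics.mean(f["brightness"] for f in frames)
    sharpness = statistics.mean(f["sharpness"] for f in frames)
    shake = statistics.mean(f["shake"] for f in frames)
    exposure_ok = 1.0 - abs(brightness - 0.5) * 2.0   # prefer mid-range exposure
    return 0.4 * exposure_ok + 0.4 * sharpness + 0.2 * (1.0 - shake)

def keep_best(shots, threshold=0.6):
    """Discard shots that are too dark or bright, blurry, or unstable."""
    return [s for s in shots if aesthetic_score(s["frames"]) >= threshold]

shots = [
    {"id": "good", "frames": [{"brightness": 0.5, "sharpness": 0.9, "shake": 0.1}]},
    {"id": "shaky", "frames": [{"brightness": 0.5, "sharpness": 0.3, "shake": 0.9}]},
]
print([s["id"] for s in keep_best(shots)])  # ['good']
```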

The team's research shows that intelligent digital tools that combine the capabilities of humans and algorithms can help users in the creative process. "Our work demonstrates the potential of automated visual-semantic matching in idiom-based computational editing and provides a smart way to make amateur video creation more accessible," says Shamir.

For the study, the approach was tested on various thematic texts and video repositories, with quantitative evaluation and user studies. Users with no video editing experience were able to produce satisfactory videos with the Write-A-Video tool, sometimes faster than professionals using frame-based editing software. At SIGGRAPH Asia, the team will demonstrate the Write-A-Video application and present a variety of examples of text-to-video productions.

The team consists of Miao Wang (State Key Laboratory of Virtual Reality Technology and Systems / Beihang University and Tsinghua University); Guo-Wei Yang (BNRist / Tsinghua University); Shi-Min Hu (BNRist / Tsinghua University); Shing-Tung Yau (Harvard University); and Ariel Shamir (IDC Herzliya, Israel).



Story Source:
Materials provided by Association for Computing Machinery. Note: Content may be edited for style and length.
