RunwayML and Pika Labs announce new features for their video AI systems.
RunwayML is adding camera controls for AI-generated video. Users can, for example, selectively zoom in and out or steer the horizontal and vertical direction of camera movement.
Until now, these elements were largely random. Runway CEO Cristóbal Valenzuela introduced the new feature on Twitter; it is currently rolling out in the Runway Gen-2 web application.
In late August, RunwayML Gen-2 added a "motion slider" that controls the amount of motion in an image animation, from slow to fast. Together, the two tools give users noticeably more control over video generation.
Pika Labs gets image animation and 24 frames per second
Runway’s competitor Pika Labs also released an image animation feature in early September. With this feature, you can animate an image created with an AI image tool like Midjourney. The following trailer was created this way. A video editing program was used to stitch together the Pika video clips and add sound.
In addition, video can now be created at up to 24 frames per second, which is the typical frame rate for movies. Previously, the maximum frame rate was 8 frames per second. With the new feature, the frame rate can be freely set between 8 and 24 frames per second. The video below shows the difference.
Pika Labs catches up with RunwayML
Like Midjourney, Pika Labs is hosted on Discord. There, the AI video service has gained about 160,000 users in the last two months. Pika Labs was founded in late April by Stanford students Demi Guo and Chenlin Meng.
Pika Labs is backed by former GitHub CEO Nat Friedman, among others, according to The Information. He and several other investors have put $15 million into Pika Labs since its founding. Its competitor Runway is valued at $1.5 billion and has raised $236.5 million from investors since its founding in 2018.
Friedman reportedly lured Pika Labs with, among other things, computing power for AI training. He and his partner Daniel Gross recently bought 2,512 Nvidia H100 GPUs for $100 million.