
These simple prompt tweaks made my AI videos look way better — try them yourself
As chatbots continue to get smarter and gain the ability to do more, AI video generators are evolving just as fast. And while many are still in beta or invite-only, one thing is clear: prompt quality makes a huge difference.

After testing several platforms (including Google Veo 3, the premium video generator currently available for $249/month through Google Ultra), I've learned that the way you write your prompts completely changes the results.
Here's how I leveled up my AI videos — and the tricks you can use to make your own clips look sharper, more cinematic and more coherent.

To keep things fair, all prompts were tested using Veo 2 — Google's lower-tier video generator that doesn't include audio. To try it yourself, you'll need to join the waitlist for Google Labs' VideoFX tool, where Veo 2 is currently available.
You can also access Veo 3 through Google AI Studio or Canva Pro. These prompts can be used on any of the best AI video generators including Midjourney, Firefly and Sora.
A common mistake many users make is starting with adjectives, like 'a stunning cinematic video of a beach at sunset.' It sounds nice, but it's vague.
Instead, I always find better success when I lead with structure: 'wide establishing shot of a beach at golden hour, camera slowly panning left.' The more you write like a director, the more polished your results will look.
Veo seems to respond well to cinematic language.
Prompts like 'low angle tracking shot of a child running through tall grass' or 'drone shot pulling away from a mountain cabin at sunrise' generated far more dynamic visuals than basic descriptions.
If you're not familiar with filmmaking terms, think in shots: wide, medium, close-up, overhead. While I am not a camera operator, I have written enough screenplays and been on enough television and movie sets to have these memorized.
Here are a few basic filmmaking terms that I frequently use in my prompts.
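If you find yourself reusing the same shot vocabulary, it can help to treat a prompt as a few slots to fill in. Here is a minimal sketch of that idea; `build_prompt` is a hypothetical helper of my own, not part of Veo or any video generator's API — it just assembles the director-style pieces into one string.

```python
def build_prompt(shot, subject, camera_move=None, mood=None):
    """Assemble a director-style prompt: shot type first, then subject,
    then optional camera movement and mood cues."""
    parts = [f"{shot} of {subject}"]
    if camera_move:
        parts.append(camera_move)
    if mood:
        parts.append(mood)
    return ", ".join(parts)

# Recreates the beach example from above.
print(build_prompt("wide establishing shot", "a beach at golden hour",
                   camera_move="camera slowly panning left"))
```

Swapping in "close-up" or "overhead shot" for the shot slot is then a one-word change, which makes it easy to rerun the same scene with different framing.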
Instead of asking for a whole story in one sentence, break your prompt into visual beats. For example:
Beat 1: Wide shot of a skateboarder riding downhill at sunset
Beat 2: Slow motion close-up of wheels hitting the pavement
Beat 3: Camera follows from behind as sparks fly from the board
Even if your preferred AI video generator doesn't yet support full scene transitions, this approach helps guide the system toward more intentional storytelling.
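Since most generators take a single text field, the beats above ultimately get joined into one prompt. A quick sketch of that, keeping the "Beat N:" labels so the model still sees the intended sequence (the labeling scheme is my own convention, not anything Veo requires):

```python
# Visual beats, in the order they should appear in the clip.
beats = [
    "Wide shot of a skateboarder riding downhill at sunset",
    "Slow motion close-up of wheels hitting the pavement",
    "Camera follows from behind as sparks fly from the board",
]

# Join the beats into one labeled prompt string.
prompt = " ".join(f"Beat {i}: {b}." for i, b in enumerate(beats, start=1))
print(prompt)
```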
Movement is key. I started adding phrases like 'camera tilts up,' 'fog rolling in,' or 'wind rustling trees.'
These made a surprising difference in realism and cinematic feel. So did mood cues like 'soft lighting,' 'overcast skies,' or 'neon glow.'
AI video is still new, and results can be unpredictable. I ran each prompt multiple times to compare outputs, tweaking a word here and there.
Veo's consistency is noticeably better than other tools I've tried. It doesn't always nail it, but when it does, the results can look like they came from a professional video shoot. The prompts below were also run through Veo 3, so those clips include sound.
Landscape prompt: Wide establishing shot of a mountain range at golden hour, camera slowly panning left. Fog rolls between the peaks, soft ambient lighting, cinematic depth of field.
Sports prompt: Slow-motion shot of a skateboarder jumping off a ramp at sunset. Camera follows from low angle as dust kicks up. Warm lighting, dynamic energy.
Food reel prompt: Overhead shot of a steaming bowl of ramen being served in a Tokyo street market. Close-up of chopsticks lifting noodles, steam rising. Warm lighting, vibrant details.
Action POV prompt: First-person POV of a cyclist riding down a forest trail. Leaves fly past, sun flickers through trees. GoPro-style realism, immersive movement.
If your AI videos look generic or "like AI," your prompt may be to blame. With a few smart tweaks that focus on structure, camera angles and specific action, you can get more polished, more cinematic results.
Google Veo is one of the most capable tools I've tested, but like any AI, it's only as good as what you feed it.
Want to try it yourself? Start small. Pick a scene, describe it like a shot list and watch what happens next. Let me know in the comments how everything came out!
