Elgato's new 4k60 Facecam might just become our new favourite, featuring a new lens, sensor, and analogue filters
When it comes to streaming and conferencing gear, Elgato is a company I've come to really appreciate. They make some of my favourite streaming gear, like great keylights, and are also behind some of the best webcams around. The brand's Facecam line has impressed us since its 2021 debut, which was Elgato's first ever webcam, and it continues to do so today with the Facecam MK.2 sitting atop our list of beloved cameras. Now we're treated to the latest in that lineup as Elgato releases the Facecam 4k. I'm only going off specs here, but it sounds like a really great webcam.
Right out of the gate, the Elgato Facecam 4k presents some impressive details to drool over. We see the same Elgato Prime lens that impressed us on the MK.2, paired with an improved 1/1.8" Sony Starvis 2 CMOS sensor. That pairing should mean the image the camera captures with its own optics is very solid, and it makes for a great foundation for the rest of this webcam's performance.
These aren't the only interesting pieces of hardware on offer, as Elgato has gone surprisingly analogue here. The Facecam 4k sports manual shutter-speed and ISO settings, so you can dial in how much light the sensor takes in and how quickly. If you haven't got the best lighting, this should be a huge help in striking a good balance without things getting too grainy.
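To put that shutter/ISO trade-off in rough numbers: with a fixed aperture (as on most webcams), the light captured scales with shutter time, while ISO multiplies the signal, noise included. A back-of-the-envelope sketch with made-up example values, not Elgato's actual settings:

```python
import math

def exposure_stops(shutter_s: float, iso: float,
                   ref_shutter_s: float = 1 / 60, ref_iso: float = 100) -> float:
    """Relative exposure in stops versus a reference setting.

    Assumes a fixed aperture, so image brightness scales with
    shutter time * ISO. +1 stop means twice the light.
    """
    return math.log2((shutter_s * iso) / (ref_shutter_s * ref_iso))

# Halving the shutter time while doubling ISO keeps exposure constant,
# at the cost of a noisier (grainier) image:
print(exposure_stops(1 / 60, 100))   # 0.0 stops (the reference)
print(exposure_stops(1 / 120, 200))  # 0.0 stops, but with more gain noise
print(exposure_stops(1 / 30, 100))   # +1.0 stop brighter
```

The point of having both controls exposed is exactly this: in dim lighting you can choose between a slower shutter (risking motion blur) and a higher ISO (risking grain), rather than letting auto-exposure pick for you.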
There are also physical lens filters. You know, those bits of coloured plastic you put in front of a lens to get different effects? Some of the younger crew may not be familiar with these, as filters are usually digital here in 2025, but they definitely still exist. And now they can live on your webcam, which I think more creative streamers and content creators will have a lot of fun playing with. The mount fits any 49 mm filter you can find, and Elgato is currently throwing some in for those who order from its webstore.
The 4k in this Facecam's name isn't just for show. It can record HDR video at up to 4k 30 fps, and you can also grab uncompressed video at up to 4k 30, as long as your USB connection has the bandwidth for it. You can also drop the resolution to 1080p if you're after higher framerates, and generally dial things in to whatever suits your setup best.
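For a sense of why uncompressed 4k needs a fast USB link, here's the rough arithmetic. The pixel format is an assumption on my part (YUV 4:2:2 at 2 bytes per pixel is a common choice for uncompressed webcam output), not a confirmed Facecam spec:

```python
def uncompressed_bitrate_gbps(width: int, height: int, fps: int,
                              bytes_per_pixel: float) -> float:
    """Raw (uncompressed) video bitrate in gigabits per second (1 Gb = 1e9 bits)."""
    return width * height * bytes_per_pixel * fps * 8 / 1e9

# Uncompressed 4k30, assuming YUV 4:2:2 at 2 bytes per pixel:
print(round(uncompressed_bitrate_gbps(3840, 2160, 30, 2), 2))  # 3.98 Gbps

# 1080p60 at the same assumed pixel format is far lighter:
print(round(uncompressed_bitrate_gbps(1920, 1080, 60, 2), 2))  # 1.99 Gbps
```

At nearly 4 Gbps, uncompressed 4k30 sits close to the practical ceiling of a 5 Gbps USB 3.x link once protocol overhead is accounted for, which is why the bandwidth caveat matters.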
Through Elgato's Camera Hub software you can fine-tune much of this and play with other effects. That includes full pan and tilt controls and digital presets, and if you're packing an Nvidia RTX card, you can potentially access extra features like background blur.
If Elgato's Facecam 4k does what it says on the tin, I think we could have a contender for a new top webcam on our hands. It's launching at $200 USD, which doesn't feel unreasonable for what looks like a really decent upgrade over the 1080p60 Facecam, which still retails for $140. An extra $60 for a better sensor, an upgraded lens, 4k capture, and all those sweet, weird lens-filter extras sounds very worthwhile, and I think this camera will become a staple for many streamers in the near future.
