Categories
Entertainment

The European Union is investigating Meta’s election policies

The EU has officially opened a significant investigation into Meta for its alleged failures to remove election disinformation. While the European Commission’s statement doesn’t explicitly mention Russia, Meta confirmed to Engadget the EU probe targets the country’s Doppelganger campaign, an online disinformation operation pushing pro-Kremlin propaganda.

Bloomberg’s sources also said the probe was focused on the Russian disinformation operation, describing it as a series of “attempts to replicate the appearance of traditional news sources while churning out content that is favorable to Russian President Vladimir Putin’s policies.”

The investigation comes a day after France said EU member states had been targeted by pro-Russian online propaganda ahead of European parliamentary elections in June. On Monday, France’s minister for Europe, Jean-Noël Barrot, urged social platforms to block websites “participating in a foreign interference operation.”

A Meta spokesperson told Engadget that the company had been at the forefront of exposing Russia’s Doppelganger campaign, first spotlighting it in 2022. The company said it has since investigated, disrupted and blocked tens of thousands of the network’s assets. The Facebook and Instagram owner says it remains on high alert to monitor the network, while claiming Doppelganger has struggled to build organic audiences for its pro-Putin fake news.

Mark Zuckerberg onstage during a company keynote presentation. Profile view from his left side.

Image: Meta

The European Commission’s President said Meta’s platforms, Facebook and Instagram, may have breached the Digital Services Act (DSA), the landmark legislation passed in 2022 that empowers the EU to regulate social platforms. The law allows the EC, if necessary, to impose heavy fines on violating companies of up to six percent of global annual turnover, a prospect that could change how social media companies operate.

In a statement to Engadget, Meta said, “We have a well-established process for identifying and mitigating risks on our platforms. We look forward to continuing our cooperation with the European Commission and providing them with further details of this work.”

The EC probe will cover “Meta’s policies and practices relating to deceptive advertising and political content on its services.” It also addresses “the non-availability of an effective third-party real-time civic discourse and election-monitoring tool ahead of the elections to the European Parliament.”

The latter refers to Meta’s deprecation of its CrowdTangle tool, which researchers and fact-checkers used for years to study how content spreads across Facebook and Instagram. Dozens of groups signed an open letter last month, saying Meta’s planned shutdown during the crucial 2024 global elections poses a “direct threat” to global election integrity.

Meta told Engadget that CrowdTangle only provides a fraction of the publicly available data and would be lacking as a full-fledged election monitoring tool. The company says it’s building new tools on its platform to provide more comprehensive data to researchers and other outside parties. It says it’s currently onboarding key third-party fact-checking partners to help identify misinformation.

However, with Europe’s elections in June and the critical US elections in November, Meta had better get moving on its new API if it wants the tools to work when it matters most.

The EC gave Meta five working days to respond to its concerns before it would consider further escalating the matter. “This Commission has created means to protect European citizens from targeted disinformation and manipulation by third countries,” EC President von der Leyen wrote. “If we suspect a violation of the rules, we act.”


Categories
Computers

Meta’s Ray-Ban Smart Shades Get a Fresh Blast of AI

Meta’s newest smart glasses, developed in partnership with Ray-Ban, have been newly fleshed out with more AI features. This week, Meta started rolling out an over-the-air update to its second generation of smart sunglasses that gives the wearables some new capabilities.

The biggest update is the Meta AI with Vision feature, which incorporates Meta’s generative AI assistant into the spectacles. Owners of the smart glasses will be able to activate an AI voice assistant, fiddle with (nearly) real-time translation, and identify stuff in the wearer’s vision. It all sounds very futuristic for sunglasses, though users have reported that, like all these newfangled AI systems, some features work better than others.

Other new features in the update include video calling in the WhatsApp and Facebook Messenger apps and the ability to share the wearer’s view, shot from the glasses’ front-facing camera. The glasses also come in two new frame styles: the lower-bridge Headliners and the cat-eye Skylers.

Walking around with a pair of cameras strapped to your face might still feel a little dystopian, but the fact that they look like regular old Ray-Bans makes the Meta shades blend into daily life more than the smart glasses of old like Google Glass. So yes, it certainly gets points for style, much like Mark Zuckerberg and his new chain obsession these days. But it’s also worth taking a moment to remember that these sick shades are coproduced by a company that has a history of letting its users’ data fall into the wrong hands. You’ll look dope in them, sure, but you’ll also be giving Meta first dibs on all the new parts of your life you’re capturing.

Here’s some other consumer technology news from this week.

Bag Your Recycling

The new Freitag Mono[PA6] bag.

Photograph: Freitag

Freitag, the Swiss company known for making upcycled bags and backpacks, has a slick new black sack. The Mono[PA6] Backpack can hold up to 24 liters of stuff and comes with a smaller detachable musette that can be worn like a sling or purse. The company says every bit of the bag is made from a single nylon material (polyamide 6); the flaps, straps, and zippers are all cobbled together from that one base compound. That means you can send it back to Freitag, where the company can fully break it down and recycle the material to make another bag. The new piece retails for $380.


Categories
Featured

Meta’s massive OS announcement is more exciting than a Meta Quest 4 reveal, and VR will never be the same again

Meta has announced that its Meta Horizon OS will no longer be exclusive to its Quest headsets (such as the incredible Meta Quest 3), and this might be the most important news we’ll see in the XR space this decade.

It’s an announcement I’ve been expecting for years, for reasons I’ll outline below, but the short version is that Meta has started to turn Horizon OS into the Windows of the spatial computing era. It even comes complete with a rival Apple OS (the Apple Vision Pro’s visionOS) and a dash of irony, given that one of the first three non-Quest systems will be Xbox-branded (Xbox being Microsoft’s gaming division, for those not in the know).


Categories
Featured

Meta’s recent Quest 3 update includes a secret AI upgrade for mixed reality

Meta’s VR headsets recently received update v64, which, according to Meta, added several improvements to their software – such as better-quality mixed-reality passthrough in the case of the Meta Quest 3 (though I didn’t see a massive difference after installing the update on my headset).

It’s now been discovered (first by Twitter user @Squashi9) that the update also included another upgrade for Meta’s hardware, with Space Scan, the Quest 3’s room scanning feature, getting a major buff thanks to AI.




Categories
Featured

Can you really meditate in VR? I tried Headspace XR at Meta’s London HQ

I was trying to breathe slowly, relaxing my shoulders and following the visual cues inside a pastel-colored world bathed in an orange sunset. It was almost easy to forget I was being watched carefully by several Meta and Headspace representatives, like a sort of laboratory experiment.

Trying to act natural and relaxed, while being watched and analyzed by unseen observers? My mind forgot to be quiet for a moment, instead conjuring up the image of myself in a police interrogation room, and I tried to suppress a snort. 


Categories
News

Meta’s new CodeLlama 70B performance tested

Meta AI has this week released CodeLlama 70B, a new large language model specifically designed to assist developers and coders. The new AI coding model has an impressive 70 billion parameters but is capable of being run locally. This model is designed to handle a wide range of tasks, from language processing to complex problem-solving. It’s a sophisticated tool that’s capturing the attention of developers and businesses alike. But how does it compare to other AI models, such as DeepSeek Coder, which has 33 billion parameters? Let’s dive into a detailed performance evaluation of these two AI powerhouses.

When you first start working with CodeLlama 70B, you’ll notice it’s not as straightforward as some other models. It has a unique way of interpreting prompts, which means you’ll need to spend some time getting used to its system. The model uses a tokenizer to translate your input into a format it can understand, which is crucial for getting the most out of its capabilities. This includes learning how to use new source tokens and a ‘step’ token that helps with message formatting. Mastering these elements is essential if you want to fully leverage what CodeLlama 70B has to offer.
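
To make that concrete, here is a minimal sketch of how the prompt formatting might look using the Hugging Face transformers library. The checkpoint name and the behavior of the chat template are assumptions based on the public CodeLlama 70B Instruct release, not details taken from this article.

```python
# Minimal sketch (assumption: Hugging Face transformers and the public
# codellama/CodeLlama-70b-Instruct-hf checkpoint). The chat template takes
# care of the model's special source/step tokens so you don't have to
# hand-assemble the prompt string.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("codellama/CodeLlama-70b-Instruct-hf")

chat = [
    {"role": "system", "content": "You are a careful coding assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
]

# Render the conversation into the exact prompt format the model expects.
prompt = tokenizer.apply_chat_template(
    chat, tokenize=False, add_generation_prompt=True
)
print(prompt)
```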

CodeLlama 70B performance tested

However, the advanced nature of CodeLlama 70B comes with its own set of demands, particularly when it comes to hardware. The model’s size means it needs a lot of VRAM, which could require you to invest in more powerful equipment or consider renting server space. This is an important consideration for anyone thinking about integrating this model into their workflow. Despite these requirements, CodeLlama 70B is exceptional at generating structured responses that are in line with validation data. Check out the performance testing of CodeLlama 70B kindly carried out by Trelis Research, which provides a fantastic overview of what you can expect from the latest large language model to be rolled out by Meta AI.
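
As a rough illustration of those hardware demands, the back-of-the-envelope arithmetic below estimates the weight-only memory footprint of a 70-billion-parameter model at common precisions. The figures ignore the KV cache and activations, so real requirements will be somewhat higher; they are illustrative, not official numbers from Meta AI.

```python
# Rough, weight-only VRAM estimate for a 70B-parameter model. These numbers
# are illustrative arithmetic, not official requirements from Meta AI.
PARAMS = 70e9

for label, bytes_per_param in [("fp16/bf16", 2), ("8-bit", 1), ("4-bit", 0.5)]:
    gib = PARAMS * bytes_per_param / 1024**3
    print(f"{label:>9}: ~{gib:.0f} GiB for the weights alone")

# Expected output (approximately):
#   fp16/bf16: ~130 GiB
#   8-bit:     ~65 GiB
#   4-bit:     ~33 GiB
```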

When we put CodeLlama 70B to the test with specific tasks, such as reversing letter sequences, creating code, and retrieving random strings, the results were mixed. The model has built-in safeguards to ensure that outputs are safe and appropriate, but these can sometimes restrict its performance on certain tasks. However, these safety features are crucial for maintaining the model’s overall reliability.

For those who are interested in using CodeLlama 70B, it’s a good idea to start with smaller models. This approach allows you to create a more manageable environment for testing and development before you tackle the complexities of CodeLlama 70B. This model is really meant for production-level tasks, so it’s important to be prepared. Fortunately, there are resources available, such as one-click templates and a purchasable function calling model, that can help ease the transition.
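
As a starting point, the sketch below shows what “begin with a smaller model” might look like in practice, using a 7B Code Llama variant through the transformers pipeline API. The checkpoint name and generation settings are illustrative assumptions, not recommendations from Trelis Research or Meta AI.

```python
# Illustrative sketch: trying a smaller Code Llama checkpoint locally before
# scaling up to the 70B model. Assumes the transformers and torch packages
# and the public codellama/CodeLlama-7b-Instruct-hf checkpoint.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="codellama/CodeLlama-7b-Instruct-hf",
    torch_dtype=torch.float16,   # halves memory versus fp32
    device_map="auto",           # place layers on GPU(s) if available
)

result = generator(
    "Write a Python function that checks whether a string is a palindrome.",
    max_new_tokens=128,
    do_sample=False,             # deterministic output for easier comparison
)
print(result[0]["generated_text"])
```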

CodeLlama 70B stands out in the field of AI for its advanced capabilities and its strong performance in adhering to validation data. However, the practical challenges it presents, such as its size and VRAM requirements, cannot be overlooked. By beginning with smaller models and utilizing available resources, you can prepare yourself for working with CodeLlama 70B. This will help ensure that your projects meet the highest quality standards and that you can make the most of this powerful AI tool.

Filed Under: Guides, Top News

Categories
News

How to use AudioBox, Meta’s new text-to-sound AI tool

Meta has this month unveiled a new AI sound generator called AudioBox, which is set to transform the way we interact with sound. This innovative tool allows users to convert text into speech, compose music, and create sound effects with ease, using simple text prompts. The introduction of AudioBox marks a significant step forward in making the creation of custom audio content more accessible to a wide range of users.

Meta’s Audiobox text-to-sound audio creation system is the successor to Voicebox, advancing generative AI for audio even further by unifying generation and editing capabilities for speech, sound effects (short, discrete sounds like a dog bark, a car horn, or a crack of thunder), and soundscapes, with a variety of input mechanisms to maximize controllability for each use case.

AudioBox comes packed with a variety of features that meet a wide array of audio generation needs. For instance, its text-to-speech capability provides users with the ability to turn written text into realistic speech, offering a range of voice options to choose from. Those interested in music can use AudioBox to craft music tracks without needing to master traditional instruments or complex software. Additionally, the tool is capable of producing tailored sound effects, which can be particularly useful for gaming, film, and other multimedia projects. Users can customize audio outputs to their specific needs using intuitive text or audio prompts.

Text to sound AI audio generation

Meta has designed Audiobox to enable people to create sounds using natural language prompts to describe the sound or type of speech they want to create. For example, if you would like to create a new sound, simply enter a text prompt like “A running river and birds chirping” into the AI model. Watch the video below for an overview and demonstration of its current capabilities.

Moreover, AudioBox is not just a basic sound generator; it includes advanced features that push the boundaries of AI-generated audio. One such feature is voice cloning, which allows the duplication of any voice from a sample, offering a personalized audio creation experience. The tool can also restyle existing audio to fit different contexts and edit or replace segments of audio seamlessly with AI-generated content, a process known as audio inpainting.

“Audiobox demonstrates state-of-the-art controllability on speech and sound effects generation. Our own tests show it significantly surpasses prior best models (AudioLDM2, VoiceLDM, and TANGO) on quality and relevance (faithfulness to text description) in subjective evaluations. Audiobox outperforms Voicebox on style similarity by over 30 percent on a variety of speech styles.”

Availability and pricing

Meta is currently making Audiobox available to a hand-selected group of researchers and academic institutions with a track record in speech research to help further the state of the art in this area. The company is committed to ensuring that AudioBox is used ethically and responsibly, and it has implemented safeguards to prevent potential misuse and ensure that the AI adheres to moral guidelines. This commitment is further demonstrated by a grant application process that supports research into the safe application of AudioBox.

Another exciting feature of AudioBox is the AudioBox Maker, which allows users to construct complex audio scenes by layering sounds and music. This enables the creation of sophisticated and immersive soundscapes that can enhance any audio experience. Meta’s Audiobox interactive demo and research paper are now available, allowing you to test out the new foundation research model for audio generation.

AudioBox is poised to make a significant impact on the audio production industry. Its comprehensive features and dedication to ethical use mean that content creators, musicians, and developers can look forward to a new realm of possibilities. As we await further updates on AudioBox, including its potential open-source release and the outcomes of ongoing safety and responsibility research, it’s clear that this tool is set to become an indispensable asset in the world of audio production.

Filed Under: Guides, Top News