
I tested Samsung’s glare-free OLED TV vs a conventional OLED TV – here’s what I learned


OLED is a much sought-after display technology in consumer products ranging from phones to TVs. OLED TVs are consistently ranked among the best TVs, thanks to their unparalleled contrast, steadily improving brightness with each new generation of sets, dynamic color and refined detail. However, there is one area where OLED TVs suffer: reflections.

The pixels in an OLED display individually dim as required, making them capable of greater light control than LED and mini-LED TVs, which use a separate backlight. But OLED TVs have also lacked brightness compared to mini-LED TVs, and their dimmer screens mean reflections can become a real issue. In recent years, brightness-boosting micro-lens-array (MLA) tech has been introduced into some of the best OLED TVs, such as the LG G3 and Panasonic MZ2000, to limit reflections. And while MLA has helped OLED TVs become brighter, reflections remain a problem.


‘We learned our lesson’: Marvel admits it’s made too many movies and TV shows for fans to keep up with


Marvel has confirmed that it will make fewer movies and TV shows moving forward after a “rough” few years of theatrical flops and favoring quantity over quality.

Speaking to Empire Magazine, studio co-president Louis D’Esposito admitted that the Disney subsidiary had “learned its lesson” after it saturated the superhero genre market with projects that didn’t live up to fans’ lofty expectations. The comic book giant is now looking to reduce its output in theaters and on Disney Plus with the aim of regaining audiences’ trust after a period of diminishing box-office returns, poorly reviewed productions, and other criticisms.




Apple iPad event 2024 – 9 things we learned from the Let Loose event


Tim Cook described today’s Apple iPad event as “the biggest day for iPad since its introduction” – and hype or not, he might have a point. After all, the Let Loose event saw the launch of new iPad Airs and iPad Pros, a new Apple M4 chipset, plus new accessories and software. No, it doesn’t compare to what we expect to see at WWDC 2024, but it was a pretty big deal.

So, what were the highlights – and lowlights – of the 40-minute presentation? Here’s everything we learned from the May 2024 Apple iPad event.

1. The new iPad Pro 2024 looks seriously powerful and impressively thin…


I guess I learned how to appreciate The Phantom Menace


More than anything, Star Wars: Episode 1 – The Phantom Menace is a fascinating cultural object. It’s been 25 years since I saw the film in theaters, and over a decade since I last rewatched it (in a vain attempt to help my Trekkie wife catch up to the prequels). I’ve had enough time to process the initial disappointment and embarrassment of introducing my wife to Jar Jar Binks. So when Disney announced it was bringing the prequel trilogy back to theaters, I was practically giddy about revisiting them to see how George Lucas’s final films compared to the onslaught of Star Wars media we’ve experienced over the past decade. Was The Phantom Menace as bad as I’d remembered? Well, yes and no.

Star Wars: Episode 1 - The Phantom Menace (Disney/Lucasfilm)

In 1999, I knew Episode 1 would be a bit of a slog as soon as we hit the second line of the opening crawl: “The taxation of trade routes to outlying star systems is in dispute.” Really, George? This was what Star Wars fans were waiting for since 1983’s Return of the Jedi? During this rewatch, I was more tickled than annoyed by the many baffling narrative choices: The empty drama of a trade blockade; the confusing decision to establish a romance between a literal child and an older teenager; and throwing in Jar Jar Binks to appease kids amid the hideously dull dialog.

It’s as if The Phantom Menace was written and directed by an alien who hadn’t actually seen a movie, or engaged in any aspect of pop culture, since the early ’80s. At the same time, that near-outsider perspective is part of the film’s charm. Seeing a society slowly lose control of an idealistic democracy to a power-hungry dictator is a lot for a PG-rated fantasy film. Yet that also sets up the first two prequels to feel eerily prescient beside the global response to 9/11.

By the time we reached 2005’s Revenge of the Sith, the allusions to George W. Bush’s Patriot Act and Global War on Terror were hard to miss. “This is how liberty dies, with thunderous applause,” Padme says as her fellow Senators hand over emergency powers to Palpatine, turning Supreme Chancellor Palpatine into the Emperor, and transforming the Galactic Republic into the Galactic Empire.


Beyond political machinations, The Phantom Menace is filled with gorgeous imagery: Naboo’s lush palace and aquatic Gungan city; the designs of new ships and weapons; and, of course, every single outfit worn by Princess Amidala. It would have been nice if these visuals cohered into the narrative better, but their presence makes it clear that Lucas was surrounded by world-class talent.

The Phantom Menace also leaps to life in its handful of action set-pieces. Sure, maybe the pod-race goes on a bit too long, but the sense of speed, scale and bombastic sound throughout is still absolutely thrilling. (The film’s sound team — Gary Rydstrom, Tom Johnson, Shawn Murphy and John Midgley — was nominated for an Oscar, but lost out to The Matrix.)

And yes, the entire Duel of the Fates fight is still an absolute banger. There’s no doubt that The Phantom Menace would have been a stronger film with less-clunky dialog and more character development shown through action. At one point in the fight, all of the participants are separated by laser barriers. Qui-Gon Jinn meditates, almost completely at peace. Darth Maul prowls like a caged lion. And Obi-Wan Kenobi is simply eager to get on with the fight, like a hot-shot student who just wants to show off. That sequence tells you more about those characters than the remaining two hours of the film.


While I didn’t come around to loving Jar Jar Binks during this rewatch, his very existence as a fully-CG character felt more significant than ever. Voiced by the actor and comedian Ahmed Best, Jar Jar was roundly trashed upon release and his implementation was far from seamless. But it was also the first time we saw a motion-captured performance be transformed into a fully-realized character. Now that technology is so common in movies we practically take it for granted.

“You can’t have Gollum without Jar Jar,” Best said. “You can’t have the Na’vi in ‘Avatar’ without Jar Jar. You can’t have Thanos or the Hulk without Jar Jar. I was the signal for the rest of this art form, and I’m proud of Jar Jar for that, and I’m proud to be a part of that. I’m in there!”

In 2017, Best offered an expanded version of his thoughts in a Twitter thread: “Jar Jar helped create the workflow, iteration process and litmus test for all CGI characters to this day. On some days the code was being written in real time as I was moving. To deny Jar Jar’s place in film history is to deny the hundreds of VFX technicians, animators, code writers and producers their respect. People like John Knoll, Rob Coleman and scores of others who I worked with for two years after principal photography was ended to bring these movies to you.”


A great story stuck in a bad film

I’ve learned the best way to watch The Phantom Menace is to take in the aspects that I like and replace Lucas’s many baffling choices with my own head canon. The story of Anakin Skywalker being born through the sheer power of the Force and becoming the Jedi’s Chosen One? That’s interesting! Inventing Midi-chlorians to give people a literal Jedi power score? That’s bad, to hell with you! (Midi-chlorians are still technically canon, but they’ve been largely ignored in recent Star Wars media.)

This time around, I couldn’t help but imagine how a more natural and energetic storyteller would have tackled The Phantom Menace. Surely they wouldn’t front-load trade disputes and taxation. A more skilled writer, like Andor’s Tony Gilroy, could thoughtfully weave together the Republic’s potential downfall. And I’d bet most people wouldn’t waste Ewan McGregor’s Obi-Wan by keeping him off-screen for an hour, while everyone else goes on a pod-racing adventure. (It sure would be nice to have him spend more time with Anakin!)


I still haven’t seen , but his decision to start in the middle of Phantom Menace’s climactic lightsaber battle makes sense. So much of Episode 1 feels entirely superfluous when the real story of Anakin Skywalker is about falling in love, being tempted by the Dark Side and ultimately betraying his master.


I used my DSLR for the first time in years since switching to mirrorless – here are four things I learned


Take the strain, and three, two, one, pull! No, I’m not in the gym lifting weights, but in the woods with my Nikon DSLR and raising its optical viewfinder to my eye to compose a picture. It’s my D800‘s first outing in years and it’s quickly reminding me why I was so happy to switch to mirrorless. At 31.7oz / 900g and combined with my Nikon 70-200mm AF-S f/2.8 VR lens (50.4oz / 1430g) it’s well over 80oz / 2300g, and being cumbersome isn’t even the worst part. 

Don’t get me wrong, I’ll come away from this walk in my local woods that’s bursting with fragrant bluebells and wild garlic with some pictures I’m super-excited about (see below), but boy do I have to work that much harder to get the results I want. And without wanting to lug a tripod around, I actually can’t get the same degree of sharpness in my pictures from this day in the dim conditions under a dense tree canopy. 


I tried Abbott’s new CGM Lingo for two weeks, and here’s what I learned


I hit the plunger on the applicator device, and felt the needle slide into the meat of my arm, just below the tricep. Surprisingly, it was pretty painless. I removed the applicator and there it was: a plastic disk around 1.5 inches in diameter, which would sit on my arm for the next two weeks, broadcasting my blood sugar levels to my phone at all times. 

It was the morning before I went to meet the team behind Lingo, a smart continuous glucose monitor, for a healthy lunch during which I could monitor my glucose levels in real time. But what is a smart continuous glucose monitor (CGM)? Does it work, and is it worth it? Here are five things you need to know about one of the leading CGMs available right now, as well as a brief breakdown of the category.

What is a continuous glucose monitor? 


I meditated with Gwyneth Paltrow, and learned how she uses her Oura smart ring


“Open your eyes, and observe all the objects of your surroundings,” said the voice of Gwyneth Paltrow, live from LA. “Now become aware of the empty space around your objects, and how the space allows everything to exist within it.”

It was odd to do a meditation breathing exercise with another person on Zoom. I was staring out the window at nothing in particular, picking a water spot on the glass as a focal point while the Iron Man franchise star and Goop owner talked me through the exercise. I was used to closing my eyes and sitting on a chair or cushion while I practiced mindfulness, but as my eyes glazed over, I could see how taking your practice anywhere could be very useful. 


ChatGPT and how Neural Networks learned to talk

ChatGPT and how Neural Networks learned to talk: a 30-year journey

Thanks to incredible advancements in neural networks and language processing, computers can now understand and respond to human language much as another person might. The journey from the first moments of doubt to the current state of achievement is a tale of relentless innovation and discovery. The Art of the Problem YouTube channel has created a fantastic history documenting the 30-year journey that has brought us to GPT-4 and other AI models.

Back in the 1980s, the notion that machines could grasp the nuances of human language was met with skepticism. Yet the evolution of neural networks from basic, single-purpose systems to intricate, versatile models has been nothing short of remarkable. A pivotal moment came in 1986, when Michael I. Jordan introduced recurrent neural networks (RNNs). By feeding earlier state back into the network at each step, RNNs could learn sequences, which is crucial for language understanding.
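To make the idea of recurrence concrete, here is a minimal sketch of a recurrent cell in Python with NumPy. It is an illustrative toy, not Jordan's exact 1986 formulation: the weights are random and untrained, and the point is only that the hidden state at each step depends on both the current input and the previous state, so the network carries a memory of the sequence.

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size = 4, 8

# Random, untrained weights (a real network would learn these).
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden: the recurrence
b_h = np.zeros(hidden_size)

def rnn_step(h, x):
    """One time step: the new state depends on the input AND the previous state."""
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

# Run a toy sequence of five one-hot "words" through the cell.
h = np.zeros(hidden_size)
for t in range(5):
    x = np.eye(input_size)[t % input_size]
    h = rnn_step(h, x)

print(h.shape)  # (8,): a fixed-size state summarizing the whole sequence
```

Because the same `rnn_step` is applied at every position, the network can in principle handle sequences of any length, but the state must squeeze the entire history into one fixed-size vector, which is the memory bottleneck the Transformer later sidestepped.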

The early 1990s saw Jeffrey Elman’s experiments, which showed that neural networks could figure out word boundaries and group words by meaning without being directly told to do so. This discovery was a huge step forward, suggesting that neural networks might be able to decode language structures on their own.

How Neural Networks learned to talk


As we moved into the 2010s, the push for larger neural networks led to improved language prediction and generation abilities. These sophisticated models could sift through massive data sets, learning from context and experience, much like how humans learn.

Then, in 2017, the Transformer architecture came onto the scene. This new method used self-attention layers to handle sequences all at once, effectively overcoming the memory constraints of RNNs. The Transformer model was the foundation for the Generative Pretrained Transformer (GPT) models.
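The core operation of the Transformer is scaled dot-product self-attention, which can be sketched in a few lines of NumPy. This is a simplified, single-head version with random, untrained weights, meant only to show how every position attends to every other position in one matrix operation rather than step by step.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # subtract max for numerical stability
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """X: (seq_len, d_model). Returns (seq_len, d_k)."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # how strongly each token attends to each other token
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # weighted mix of value vectors

rng = np.random.default_rng(1)
seq_len, d_model, d_k = 5, 16, 8
X = rng.normal(size=(seq_len, d_model))          # toy token embeddings
W_q = rng.normal(size=(d_model, d_k))
W_k = rng.normal(size=(d_model, d_k))
W_v = rng.normal(size=(d_model, d_k))

out = self_attention(X, W_q, W_k, W_v)
print(out.shape)  # (5, 8)
```

Note that nothing here iterates over time steps: the whole sequence is processed at once, which is exactly the property that freed Transformers from the RNN's sequential memory constraint.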

GPT models are known for their remarkable ability to follow instructions and perform tasks they haven’t been directly trained on, often without any task-specific examples. This was a huge leap forward in AI, showing a level of adaptability and generalization that was once thought impossible.

ChatGPT, a variant of these models, became a tool that many people could use, allowing them to interact with an advanced language model. Its ability to hold conversations that feel human has been impressive, indicating the enormous potential of these technologies.

One of the latest breakthroughs is in-context learning. This allows models like ChatGPT to take in new information while they’re being used, adapting to new situations without changing their underlying structure. This is similar to how humans learn, with context playing a vital role in understanding and using new knowledge.
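In-context learning is easiest to see in the shape of the prompt itself. The sketch below builds a generic few-shot prompt; the formatting convention is invented for illustration and no particular model API is assumed. The model's weights never change, and the "learning" comes entirely from the examples placed before the query.

```python
# Toy few-shot prompt: the task is demonstrated by examples in the text,
# not by any update to the model's parameters.
examples = [
    ("cheese", "fromage"),
    ("house", "maison"),
    ("cat", "chat"),
]

def few_shot_prompt(examples, query):
    lines = ["Translate English to French."]
    for en, fr in examples:
        lines.append(f"English: {en} -> French: {fr}")
    lines.append(f"English: {query} -> French:")  # the model completes this line
    return "\n".join(lines)

prompt = few_shot_prompt(examples, "dog")
print(prompt)
```

Fed a prompt like this, a large language model infers the pattern from the three demonstrations and continues the final line, even though "translate via this exact format" was never part of its training objective.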

However, the rapid progress has sparked a debate among AI experts. Are these models truly understanding language, or are they just simulating comprehension? This question is at the heart of discussions among professionals in the field.

Looking ahead, the potential for large language models to act as the basis for a new type of operating system is significant. They could transform tasks that computers typically handle, marking a new era of how humans interact with machines.

The road from initial doubt to today’s advanced language models has been long and filled with breakthroughs. The progress of neural networks has transformed language processing and paved the way for a future where computers might engage with human language in ways we never thought possible. The transformative impact of these technologies continues to reshape our world, with the promise of even more astounding advancements on the horizon.
