Knowing where to pre-order Star Wars Outlaws is now a hot topic, with the game’s release date out in the wild – it arrives on PS5, Xbox Series X, and PC on August 30.
Plenty of retailers are getting in on the listing action already and there’ll likely be more to come out of the woodwork soon. So getting to the right places efficiently to spend your hard-earned cash is paramount – and that’s where we come in as we can direct you to the best place to pre-order Star Wars Outlaws quickly.
The main thing to know is that there are three versions to choose from for the game on PS5, Xbox Series X, and PC. The Standard Edition gets you the main game and a cosmetic pack as the pre-order bonus; the Special Edition bags you a few more digital goodies; while the Gold Edition throws in the season pass and some early access content too. For full details, jump down to our dedicated section on what's included with each edition.
Anyway, let’s stop beating around the bush and get to it. Below are all the Star Wars Outlaws pre-order editions and the best links to use to snag your copy.
Star Wars Outlaws pre-orders – PS5 US
Star Wars Outlaws pre-orders – PS5 UK
Star Wars Outlaws pre-orders – Xbox Series X US
Star Wars Outlaws pre-orders – Xbox Series X UK
Star Wars Outlaws pre-orders – What’s in each edition?
(Image credit: LucasFilm Games/Ubisoft)
With multiple Star Wars Outlaws pre-order editions and options, there's also a range of digital bonuses and levels of access that come with each tier. If you're looking for a quick overview to inform your choice, then we've got you covered.
Starting simply, if you pre-order any version of Star Wars Outlaws you’ll get the following:
The base game
The Kessel Runner Bonus Pack that includes the Spaceship Cosmetic and Speeder Cosmetic Pack
(Image credit: Lucas Film Games/Ubisoft)
The Special Edition, sitting in the middle of the available variants and sold by only a couple of retailers, gets you the standard version of the game plus a few extras, taking your full package to the following list of items:
Get the hottest deals available in your inbox plus news, reviews, opinion, analysis and more from the TechRadar team.
The base game
The Kessel Runner Bonus Pack that includes the Spaceship Cosmetic and Speeder Cosmetic Pack
The Sabacc Shark Character Pack, which includes cosmetics for Kay, her blaster, and Nix
(Image credit: LucasFilm Games/Ubisoft)
The Gold Edition gets you everything in the previous two variants, the same pre-order bonuses, and also throws in a season pass to get you access to the DLC that will follow the main release, as well as three days of early access to the game. The full list for this version is as follows:
The base game
The Kessel Runner Bonus Pack that includes the Spaceship Cosmetic and Speeder Cosmetic Pack
The Sabacc Shark Character Pack, which includes cosmetics for Kay, her blaster, and Nix
Season Pass, which includes two upcoming story pack DLCs, plus an instant Day 1 mission and cosmetics
Three days of early access to the game (play before the official release date)
I had three flagship phones on three different tripods all aimed at a sun rapidly being crowded by a nuisance moon, and all I wanted was one or two excellent eclipse shots.
Turns out that photographing a solar eclipse with your smartphone is not that easy. In fact, figuring out a repeatable process without cauterizing your retinas is downright challenging. But I did it. I grabbed some of the best smartphones money can buy, the iPhone 15 Pro Max, Google Pixel 8 Pro, and the Samsung Galaxy S23 Ultra, and prepared for 180 minutes of celestial excitement.
That last selection might turn a few heads. It is, after all, a now aging flagship Android phone that does not have the latest image processing or even the fastest Qualcomm Snapdragon 8 Gen 3 chip found in the Galaxy S24 Ultra (the S23 Ultra has the Gen 2). However, one thing it has that none of my other flagship smartphones offer is a 10X optical zoom (not even the S24 Ultra has that).
Throughout this endeavor I committed to not using any enhancements, leaving the phones’ zoom lenses to do their best work without digital magic. I never pinched and zoomed. I just pointed each phone at the eclipse and hit the shutter.
Making an adjustment
(Image credit: Future / Lance Ulanoff)
Except as soon as I did this, I realized it wasn’t going to work. The sun naturally blows out the exposure on all the phones. It’s not that I haven’t taken pictures of the sun before. I’ve snapped quite a few with the iPhone and to get over the blowout, I tap the sun on screen and that speeds up the exposure to lower the light and bring out the sun’s definition.
An eclipse wreaks havoc with a smartphone’s exposure controls, and the more the moon occludes the sun, the sharper that light becomes. My solution was simple and likely one you’ve seen elsewhere. I took my Celestron eclipse glasses and carefully placed the film of one sunglass lens over each phone’s zoom lens. If you ever have trouble identifying which camera is the zoom, just open the camera app, select the max optical zoom, and put your finger over each camera lens until you see your finger on the screen.
Three phones, three tripods (Image credit: Future / Lance Ulanoff)
The solar sunglasses helped with cutting down the massive glare. After that, I tapped on the screen and adjusted the exposure until I could see the sun getting the Pac-man treatment from the moon. In most cases, the result was a very orange-looking sun.
For the next hour or so, I shifted from one phone to the other, repositioning my tripods, lining up the sun, and snapping away.
There were some non-smartphone-related glitches, like cloud cover right before our peak totality (90% where I live) but I was more successful than I expected and the smartphones, for the most part, were up to the challenge.
Not all smartphone cameras are created equal
As you can see, the Ultra’s 10X zoom gets you closer. (Image credit: Future / Lance Ulanoff)
You’ll see some of my comparisons above and below (I’ve used the best from all the phones in the above shots) which I did not resize or enhance, other than cropping them where possible to show them side-by-side.
While the iPhone 15 Pro Max and Pixel 8 Pro shoot at 12MP (the latter is binned from a 48MP sensor, meaning four pixels combined into each one), the Samsung Galaxy S23 Ultra's 10X zoom camera is only 10MP. I think those numbers do factor into the overall quality.
(Image credit: Future / Lance Ulanoff)
The Google Pixel 8 Pro matched the iPhone 15 Pro Max's 5x zoom and sometimes seemed sharper than either the iPhone or Galaxy S23 Ultra, but I also struggled the most with the Pixel 8 Pro to capture a properly exposed shot. It was also the only phone that forced a long exposure after the peak 90% coverage. The good news is that some of those long exposures offered up the most atmosphere, managing to collect some of the cloud cover blocking my full view of the eclipse.
(Image credit: Future / Lance Ulanoff)
Things got more interesting with the iPhone 15 Pro Max and its 5x tetraprism lens. The eclipse appears a little closer than on the Pixel 8 Pro, but also more vibrant. There are a handful of iPhone 15 Pro Max pictures where I can see the clouds and it's quite beautiful. As with all the phones, this image capture process was a bit hit-and-miss. Colors shifted from orange to almost black and white, and sticking the focus was a challenge. When I did manage to capture a decent photo, I was thrilled.
One of the Google Pixel 8 Pro’s best eclipse photos. (Image credit: Future / Lance Ulanoff)
The Samsung Galaxy S23 Ultra’s 10x optical zoom pulled me thrillingly close to the eclipse. It was certainly easier to get the exposure and focus right. At a glance, the S23’s images are better but closer examination reveals significant graininess, so much so that some appear almost like paintings and rough canvas.
As I dug deeper into all the photos, I noted how each phone camera used ISO settings to manage the image capture and quality. The iPhone 15 Pro Max ranged from ISO 50 (low sensitivity, suited to very bright scenes) to ISO 800 (high sensitivity, which gathers light faster at the cost of noise). Naturally, those at the upper end of the spectrum are just as grainy as those from the Galaxy S23 Ultra, which ranges from as low as ISO 250 to 800.
Sometimes the comparison came down to a matter of taste. (Image credit: Future / Lance Ulanoff)
The Google Pixel 8 Pro has the widest range from as low as ISO 16 to an astonishing ISO 1,536. It used that for a capture of the 90% eclipsed sun behind clouds. Aesthetically, it is one of the better shots.
If I had to choose a winner here, it would be the Samsung Galaxy S23 Ultra by a nose. That extra optical zoom means you have more detail before the graininess kicks in.
The iPhone 15 Pro Max is a very close second, but only because it was easier to capture a decent shot. I also think that if it had a bigger optical zoom, the iPhone’s powerful image processing might’ve outdone the year-old Galaxy.
Probably my favorite iPhone 15 Pro Max eclipse shot. (Image credit: Future / Lance Ulanoff)
The Google Pixel 8 Pro has some great shots but also a lot of bad ones because I couldn't get it to lock in on the converging sun and moon. It also suffered the most when it came to exposure. Even so, I am impressed with the ISO range and the sharpness of some shots.
The iPhone 15 Pro Max and Google Pixel 8 Pro also deserve special mention for producing my two favorite shots. They’re not the closest or clearest ones, but by capturing some of the clouds, they add an ethereal, atmospheric element.
If I live long enough to see another eclipse (there’s one in the American Midwest in 2044), I’ll look for special smartphone eclipse filters and give it another try. By then we could well have 200x optical zoom cameras with 1,000MP sensors.
I hit the plunger on the applicator device, and felt the needle slide into the meat of my arm, just below the tricep. Surprisingly, it was pretty painless. I removed the applicator and there it was: a plastic disk around 1.5 inches in diameter, which would sit on my arm for the next two weeks, broadcasting my blood sugar levels to my phone at all times.
It was the morning before I went to meet the team behind Lingo, a smart continuous glucose monitor, for a healthy lunch during which I could monitor my glucose levels in real time. But what is a smart continuous glucose monitor (CGM)? Does it work, and is it worth it? Here are five things you need to know about one of the leading CGMs available right now, as well as a brief breakdown of the category.
What is a continuous glucose monitor?
Continuous glucose monitors (CGMs) have been used in medical settings for a while, mostly to track the blood sugar levels of diabetics, but they’re slowly becoming trendy smart wearable devices that allow anyone to monitor their blood sugar. You can read our comprehensive breakdown here on what CGMs are, and whether they’re right for you, along with our coverage of other CGMs such as the Ultrahuman M1.
Our blood sugar levels are closely tied to weight management, energy levels, sleep, digestion, stress, and lots more. Research from Pennsylvania State University, among others, has associated poor glucose management with weight gain.
Lingo is a smart CGM that broadcasts your blood sugar live to the Lingo app and uses the term “metabolic health” to offer a window into what’s happening in your body. Just eat a chocolate bar? You’ll be able to see a spike in blood sugar by opening the app and viewing your daily timeline. Just exercised, using up a lot of energy? You may see a dip after your session.
By helping you balance out spikes and dips through changes in movement, diet, and overall habits, Lingo and other CGMs like it aim to supercharge your health and energy levels.
(Image credit: Future)
1. It’s built from medically-certified software
Lingo is the first commercially available CGM from Abbott, a medical company that has previously supplied CGMs for diabetics in medical settings and other commercial partners.
Sarah Tan, EMEA General Manager at Lingo and the person responsible for bringing Lingo into stores in the UK, Europe, and elsewhere, said: “Lingo comes from 20 years of glucose monitoring and R&D… technology that is used by more than five million people every day to manage diabetes. There’s a good chance if you’re using another continuous glucose monitor on another app or another platform, it’s built on technology brought to you by Abbott.” The company has previously sold CGMs under its Freestyle Libre brand.
This is actually quite reassuring: in an era in which features on the best smartwatches have to fight to get FDA approval and be used in medical settings, buying a product from a medical-first company is, at least, a stamp of quality. That did reassure me slightly when I looked at the needle I was about to stick into my arm.
(Image credit: Future)
2. It learns about you
“We know glucose management is not something people are chatting about to each other on the street,” said Tan, “but there is a conversation about personalized health. People are time-poor, attention-poor, so to say to somebody, ‘what better way to take control of your health than knowing what’s going on inside your body at any point in time?’ That’s what we’re working towards.”
Just like the best smart rings, Fitbits, and other health-forward devices, Lingo collects data and uses it to better personalize the advice and recommendations it serves you. Your blood glucose spikes and dips are collected during the day and served up to you in an easy-to-read “Lingo Count” consisting of a double-digit number, along with personalized nuggets of advice.
For example, if you experience big spikes after lunch, Lingo might suggest you get in some light exercise after eating, which can help to flatten that post-prandial blood sugar increase.
3. It helps you adjust your behavior.
The example above is one way Lingo alters your behavioral patterns. The advice it serves is based on Lingo’s five key principles: prioritize protein, eat dietary fats, eat more fibrous green veggies, opt for savory over sweet things, and move more.
Lily Soutter, a nutritionist at Lingo, said: “This is where a biowearable like Lingo can come into play and provide that window into our bodies like never before. Not only do we see immediate benefits when managing our bloodstreams and food, but using Lingo can help us to develop those long-term healthy habits like managing stress, getting more exercise into our lives, getting more sleep. This can shift our focus for more preventative care.”
I am not the target audience for a weight-management tool. I am 5’10”, 68 kilograms and change, and I’m generally above the average level of fitness for my age group. So fortunately, even though I have a bit of a sweet tooth, I didn’t see the need to make many changes after living with Lingo for 10 days or so.
There were, however, a few points of interest for me: for example, after a heavy training session, I could see my blood sugar drop sharply on my timeline, so loaded up on lunch (with extra carbohydrates) to bring my energy levels back to a stable medium.
I also saw that unbeknownst to me, I experienced blood sugar spikes at night while I was asleep. Lingo didn’t seem to register anything wrong, and my count stayed steady at 64. But still, mildly curious and slightly concerned, I headed to the internet. The Sleep Foundation website said that “our bodies experience a cycle of changes every day—called a circadian rhythm—which naturally raises blood sugar levels at night and when a person sleeps. These natural blood sugar elevations are not a cause for concern.”
4. It’s a premium product
(Image credit: Future)
Lingo is not cheap. Above are the contents of a one-month box, including two non-reusable smart CGMs and sticky waterproof ‘shields’ to cover them with.
A two-month supply of non-reusable CGMs, so two boxes, costs £300, which equates to around $380 / AU$575. On a subscription, this will auto-renew at £120 per month, so for your first year you’ll spend £1,500, or around $1,900 / AU$2,900. That’s a big commitment, even for a bleeding-edge bio-wearable.
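As a quick sanity check on that first-year figure (assuming the subscription simply renews at the quoted £120 for each month after the initial two-month box, with no price changes):

```python
# First-year cost of Lingo based on the quoted UK pricing:
# an initial two-month supply at £300, then auto-renewal at £120 per month.
initial_two_months = 300  # GBP, covers months 1-2
monthly_renewal = 120     # GBP, each of months 3-12

first_year = initial_two_months + monthly_renewal * 10
print(first_year)  # 1500 GBP, matching the roughly £1,500 quoted above
```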
5. It’s not for everyone
As mentioned above, even though I don’t need to worry about weight loss, I had the ability to tweak my behavioral patterns to ensure my energy levels remained steady, and I had the option to investigate a spike I was unaware of. I can see how Lingo might be useful to those on the verge of being prediabetic, acting as a teaching tool to reveal previously unseen consequences of unhealthy habits. It would allow prediabetics to get their glucose levels under control before suffering from insulin resistance.
Lingo’s repackaging of a diabetes tool as a cutting-edge “metabolic health tracker” is certainly going to be appealing to the ultra-optimizing biohacking crowd as well as people who are looking to get their weight under control.
However, generally healthy people who already exercise regularly and eat a balanced diet aren’t going to get enough out of this wearable, in my opinion, to justify the ultra-premium price tag. It may be more popular as a temporary subscription, a sort of checkup for two months to reinforce healthy habits and provide answers to the curious, but ongoing subscriptions are going to cost a great deal and, perhaps, not add enough value for those who already have their glucose under control.
On Monday, April 8, a total solar eclipse will be visible across a swath of North America, from Mexico’s Pacific coast to the easternmost reaches of Canada. And in those few minutes of daytime darkness, all sorts of interesting phenomena are known to occur — phenomena NASA would like our help measuring.
During a total solar eclipse, temperatures may drop and winds may slow down or change their course. Animals have been observed to behave unusually — you might hear crickets start their evening chatter a few hours early. Even radio communications can be disrupted due to changes in the ionosphere while the sun’s light is blocked. And, the sun’s corona — its outermost atmosphere — will come into view, presenting scientists (and those of us helping them) with a rare opportunity to study this layer that’s normally invisible to the naked eye.
NASA has lots of research efforts planned for the eclipse, and has sponsored a handful of citizen science campaigns that anyone can take part in if they’re in or near the path of totality, or the areas where people on the ground can watch the sun become completely obscured by the moon. The path of totality crosses 13 US states, including parts of Texas, Oklahoma, Arkansas, Missouri, Illinois, Kentucky, Indiana, Ohio, Pennsylvania, New York, Vermont, New Hampshire and Maine. It’s an event of some significance; the next time a total solar eclipse passes over that much of the contiguous US won’t be until 2045.
All you’ll need to join in is equipment you already own, like a smartphone, and a few minutes set aside before the eclipse to go through the training materials.
NASA’s Scientific Visualization Studio
Help measure the shape of the sun
One such citizen science project is SunSketcher, a concerted effort to measure the true shape of the sun. While the sun is closer to being a perfect sphere than other celestial bodies that have been observed, it’s still technically an oblate spheroid, being a smidge wider along its equator. The SunSketcher team plans to get a more precise measurement by crowd-sourcing observations of Baily’s Beads, or the little spots of sunlight that peek out from behind the moon at certain points in the eclipse.
The Baily’s Beads effect is “the last piece of the sun seen before totality and the first to appear after totality,” NASA explained. “For a few seconds, these glimmers of light look like beads along the moon’s edge.” They’re visible thanks to the uneven topographical features on the lunar surface.
You’ll need to download the free SunSketcher app, which is available for iOS and Android. Then, a few minutes before totality (the exact time is location-dependent), put your phone on Do Not Disturb, hit “Start” in the app and prop up the phone in a place where it has a good view of the sun. After that, leave it be until the eclipse is over — the app will automatically take pictures of Baily’s Beads as they show up.
The SunSketcher website has materials to help you familiarize yourself with the process beforehand. When it’s all said and done, the pictures will be uploaded to SunSketcher’s server. They’ll eventually be combined with observations from all over to “create an evolving pattern of beads” that may be able to shed better light on the size and shape of the sun.
The SunSketcher images probably won’t blow you away, so if you’re hoping to get some great pictures of the eclipse, you’ll want to have another camera on hand for that (with the appropriate filters to protect your eyes and the device’s sensors).
NASA / Aubrey Gemignani
Record changes in your surroundings
Eclipse-watchers can also use their smartphones to record the environmental changes that take place when the sun dips behind the moon as part of a challenge run by Global Learning and Observations to Benefit the Environment (Globe). You’ll need an air temperature thermometer as well for this task, and can start logging observations in the days before the eclipse if you feel like being extra thorough.
Temperatures at the surface can, in some cases, drop as much as 10 degrees Fahrenheit during a total solar eclipse, according to NASA. And certain types of clouds have been observed to dissipate during these brief cooldowns, resulting in unexpectedly clear skies in the moments before totality. Data collected with the help of citizen scientists during the 2017 total solar eclipse showed that cloudier areas experienced a less extreme drop in surface temperatures.
To participate this time around, download the Globe Observer app from the App Store or Google Play, and then open the Globe Eclipse tool from the in-app menu. There, you’ll be able to jot down your temperature measurements and take photos of the sky to record any changes in cloud cover, and make notes about the wind conditions. Plan to dedicate a few hours to this one — NASA asks that you include observations from 1-2 hours before and after the eclipse in addition to what you’ll record during. “You will measure temperature every 5-10 minutes and clouds every 15-30 minutes or whenever you see change,” NASA says.
You can keep using the Globe Observer app for citizen science beyond eclipse day, too. There are programs running all year round for recording observations of things like clouds, land use, mosquito habitats and tree heights. The eclipse tool, though, is only available when there’s an eclipse happening.
Listen to the sounds of wildlife
Observations going back nearly 100 years have added support to the idea that total solar eclipses temporarily throw some animals out of whack. Inspired by a 1935 study that gathered observations on animal behavior during an eclipse three years prior, the Eclipse Soundscapes project is inviting members of the public to take note of what they hear before, during and after totality, and share their findings.
To take part as an observer for the project, it’s recommended that you first sign up on the website and go through the brief training materials so you can get a sense of what type of information the project is looking for. The website also has printable field notes pages you can use to record your observations on eclipse day. You should start taking notes down at least 10 minutes before totality. Only after the eclipse is over will you need to fill out the webform to submit your observations along with your latitude and longitude.
If you happen to have an AudioMoth acoustic monitoring device and a spare microSD card lying around, you can go a step further and record the actual sounds of the environment during the eclipse. You’ll need to set everything up early — the project says to do it on Saturday, April 6 before noon — and let it record until at least 5PM local time on April 10. At that point, you can turn it off, submit your notes online and mail in the SD card. All of the details for submission can be found on the project’s website.
NASA
Take photos of the solar corona
The Eclipse Megamovie project is an initiative designed to study the sun’s corona and plasma plumes from locations in the path of totality, building off of a previous campaign from the 2017 total solar eclipse. It’s already selected a team of 100 Science Team Alpha Recruits (STARs) who underwent training and were given 3D-printed tracking mounts for their cameras to shoot the best possible images. But, the project will still be accepting photo submissions from any enthusiasts who have a DSLR (and a solar filter) and want to participate.
The setup instructions are pretty exhaustive, so don’t wait until eclipse day to start figuring out your rig. You’ll be able to submit your photos after the eclipse through a form on the project’s website.
However you choose to spend the eclipse, whether you’re collecting data for a citizen science mission or just planning to kick back and observe, make sure you have everything in place well ahead of time. While the partial eclipse phases will last over an hour, totality will be over and done in about 3.5-4.5 minutes depending on where you’re watching from. You wouldn’t want to miss out on some of that time because you were fumbling with your camera.
Totality will start shortly after 11AM local time (2PM ET) for western Mexico, moving northeastward over the subsequent two-or-so hours before exiting land near Newfoundland, Canada around 5:30PM local time. There will still be something to see for people outside the path of totality, too. Most of the US will be treated to a partial eclipse that day. You can find out exactly when the eclipse will be visible from your location, along with the percentage of sun coverage you can expect to witness.
Apple this week made the first betas of iOS 17.5 and iPadOS 17.5 available to developers for testing. The upcoming software updates include only a few new user-facing features so far, but hidden code changes reveal some additional possibilities.
Below, we have recapped everything new in the first iOS 17.5 and iPadOS 17.5 betas so far.
Web Distribution
Starting with iOS 17.5, iPhone users in the EU will be able to download iOS apps directly from the websites of select developers.
“Web Distribution” will be limited to larger developers based in the EU. To qualify for this distribution method, Apple says the developer must be a member of the Apple Developer Program for two continuous years or more, and have an app that had more than one million annual installs on iOS in the EU in the prior calendar year.
“Web Distribution” builds upon the alternative app marketplaces that Apple already allows on the iPhone in the EU starting with iOS 17.4. Apple has made these app-related changes to comply with the EU’s Digital Markets Act.
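Apple's two stated eligibility conditions can be sketched as a simple predicate. This is an illustrative helper only, not an Apple API; the function name and parameters are made up here to restate the criteria above:

```python
def eligible_for_web_distribution(years_in_program: float,
                                  eu_annual_installs: int) -> bool:
    """Illustrative check of Apple's stated Web Distribution criteria:
    two or more continuous years in the Apple Developer Program, and an
    app with more than one million annual installs on iOS in the EU in
    the prior calendar year."""
    return years_in_program >= 2 and eu_annual_installs > 1_000_000

print(eligible_for_web_distribution(3, 1_500_000))  # True
print(eligible_for_web_distribution(1, 2_000_000))  # False: under two years
```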
Color-Changing Apple Podcasts Widget
While listening to a podcast on the iOS 17.5 beta, the background color of the Apple Podcasts widget changes to match the podcast’s cover art. This feature was first added in the iOS 17.4 beta, but it was removed before that update was released.
Battery Health Menu for iPad
Hidden code strings added in the iPadOS 17.5 beta point to a Battery Health menu for iPad, similar to the one on iPhone. A sample of the strings:
“iPad must be regularly used while not connected to power to show maximum capacity.”
“This is the number of times iPad has used your battery’s capacity.”
“The iPad battery is performing as expected.”
“iPad batteries, like all rechargeable batteries, have a limited lifespan and may eventually need to be serviced or replaced.”
“The original battery was designed to retain X capacity at X cycles under ideal conditions. Actual battery performance depends on a number of variables, including how iPad is used and charged regularly. The one-year warranty includes service for defective batteries in addition to rights provided under local consumer laws.”
There is no visible Battery Health menu on any existing iPad models on the first iPadOS 17.5 beta, leading us to believe that the menu may be limited to the new iPad Pro and iPad Air models that are rumored to launch in May, and likely other new iPads released in the future. Apple has shown battery capacity information on iPhones for many years, but cycle count information is currently limited to the latest iPhone 15 series.
New Apple Pencil
The first beta of iOS 17.5 potentially references a new fourth-generation Apple Pencil, amid rumors that the accessory will be updated soon.
The beta includes a hidden code reference to a “V4” version of the Apple Pencil.
A new Apple Pencil is rumored to launch alongside updated iPad Pro and iPad Air models in May. Apple refers to the USB-C version of the Apple Pencil released last year as the third-generation Apple Pencil in iOS code, so the fourth-generation model would be a new version that would likely succeed the second-generation Apple Pencil.
Additional code in the iOS 17.5 beta suggests that the Apple Pencil could gain a “squeeze” gesture for certain actions, but details are slim.
Third-Party Item Tracker Alerts
Apple and Google last year jointly announced a proposed industry specification to help combat the misuse of Bluetooth item trackers for unwanted tracking of individuals. As part of this initiative, Apple promised to expand AirTag-like “Found Moving With You” alerts to third-party item trackers in a future software update, which may be iOS 17.5.
The first iOS 17.5 beta adds references to alerts for third-party item trackers. For example: “You can disable this item and stop it from sharing its location with the owner. To do this, follow the instructions provided on a website by the manufacturer of this item.”
Tile, Chipolo, Samsung, Eufy, and Pebblebee all expressed support for the industry specification, according to Apple’s announcement last year.
Block All Participants in Group FaceTime Calls
iOS 17.5 beta code indicates that there may be a new “Block All Participants” option for group FaceTime calls, which could help users to fight spam. We have not confirmed if the feature is functional yet in the first beta.
Wrap Up
iOS 17.5 and iPadOS 17.5 will likely be released to the public in May.
If you find any other new features or changes in the iOS 17.5 and iPadOS 17.5 betas, let us know in the comments section, or by emailing [email protected].
Google just released a new version of ChromeOS which comes laden with some impressive improvements, including the ability to implement custom keyboard shortcuts and to do the same with your mouse buttons.
ChromeOS M123 delivers these new powers, and more besides, but the ability to actually define your own keyboard shortcuts will be the most welcome feature for owners of the best Chromebooks.
If you’re used to working with a particular shortcut, you can now switch over to it – and as Google points out (via The Verge), you can also change shortcuts to, say, make it easier to trigger them using just one hand (stretch that thumb and finger).
Similarly, any of your mouse buttons can be redefined to trigger various functions outside of basic clicking, like taking a screenshot. That’s going to be really handy for those who have a mouse with plenty of side buttons.
The latest update for ChromeOS also adds tethering, allowing a Chromebook to share its cellular connection with other devices. There’s now a Hotspot switch in Network Settings – flip it on, then search for and connect to the network on your other device(s).
Finally, ChromeOS 123 has new voices for its text-to-speech functionality – reading out text aloud – that are more natural sounding. These work offline and Google notes that they are available in 31 different languages.
Analysis: Some handy additions, with the odd catch
(Image credit: Future)
There are some very useful goodies here, particularly for those newer to ChromeOS who are maybe more accustomed to using a Mac or Windows PC.
Get the hottest deals available in your inbox plus news, reviews, opinion, analysis and more from the TechRadar team.
Those users will have the shortcuts from those desktop operating systems ingrained in their muscle memory, no doubt, and so getting used to a whole new way of working might be an uphill struggle. Luckily, there’s no need to relearn anything now, as you can simply switch the default ChromeOS shortcuts to whatever you’re used to.
The tethering feature could be pretty handy as well, although there is a slight catch here. Right now, the only network supported is T-Mobile in the US, so unless you have your phone with that provider, then you’re out of luck.
However, Google says it is “working to add other networks in future releases” and we’d imagine it won’t be too long before support is expanded. If you’re not seeing the update yet, fear not – Google says it’s going to be “progressively rolling out over the coming days”, so you should see it soon if your Chromebook supports it.
Apple TV+ has dozens of classic films for subscribers to watch. And it just added to the collection. Photo: Apple TV+
Apple TV+ subscribers can enjoy a collection of classic movies, with over two dozen just added in April. The list includes three John Wick films, the original Ghostbusters, a pair of Transformers movies and more.
There was already a selection of classic movies on the streaming service, plus all the content Apple TV+ itself produces.
Timeless entertainment for you to enjoy
Almost every film or TV show that appears on Apple TV+ was created just for the streaming service. But it’s not 100% Apple original programming – the service also adds classic movies. It’s at least a partial answer to criticism that TV+ has a smaller library than most of its rivals. (Small but top quality.)
Recently, a new slate of movies has appeared on Apple TV+ at the beginning of each month. And if there’s one on the list you’d like to watch, don’t dawdle: each is individually labeled as staying on the service only through the end of April or May.
But there are sometimes holdovers. That’s true for a collection of 27 movies already on the service, which includes three Star Trek movies, Mad Max: Fury Road, 200 and Minority Report.
All the classic movies added to Apple TV+ in April:
42
Anchorman: The Legend of Ron Burgundy
Armageddon
Arrival
Bridesmaids
Bridge of Spies
Clueless
Contagion
Crazy Rich Asians
Crazy Stupid Love
Dunkirk
Forrest Gump
Free State of Jones
Ghostbusters
Inception
John Wick
John Wick Chapter 2
John Wick 3
Mission: Impossible 4
Sherlock Holmes
Taken
The Departed
The Godfather
The Godfather Part II
The Heat
The Italian Job
The Town
Transformers
Transformers: Dark of the Moon
There were 29 classic movies added to Apple TV+ in April 2024. Screenshot: Apple TV+
Watch these films now on Apple TV+
Watching any of these classic films comes with a subscription to Apple TV+. The service is $9.99 per month with a seven-day free trial. You can also get it via any tier of the Apple One subscription bundle.
And Apple’s streaming video service also includes much more, of course. Apple produces an array of generally highly rated movies and series of its own. There are comedies, musicals, children’s shows, nature documentaries, etc.
Last September, while completing a grant application, I faltered at a section labelled ‘summary of progress’. This section, written in a narrative style, was meant to tell reviewers about who I was and why I should be funded. Among other things, it needed to outline any family leave I’d taken; to spell out why my budget was reasonable, given my past funding; and to include any broad ‘activities, contributions and impacts’ that would support the application.
How could I sensibly combine an acknowledgement of two maternity leaves with a description of my engagement with open science and discuss why I was worthy of the funding I’d requested? There was no indication of the criteria reviewers would use to evaluate what I wrote. I was at a loss.
Bring PhD assessment into the twenty-first century
When my application was rejected in January, the reviewers didn’t comment on my narrative summary. Yet they did mention my publication record, part of the conventional academic CV that I was also required to submit. So I’m still none the wiser as to how the summary was judged — or if it was considered at all.
As co-chair of the Declaration On Research Assessment (DORA) — a global initiative that aims to improve how research is evaluated — I firmly believe in using narrative reflections for job applications, promotions and funding. Narratives make space for broad research impacts, from diversity, equity and inclusion efforts to educational outreach, which are hard to include in typical CVs. But I hear stories like mine time and again. The academic community is attempting, in good faith, to move away from narrow assessment metrics such as publications in high-impact journals. But institutes are struggling to create workable narrative assessments, and researchers struggling to write them.
The problem arises because new research assessment systems are not being planned and implemented properly. This must change. Researchers need explicit evaluation criteria that help them to write narratives by spelling out how different aspects of the text will be weighted and judged.
Research communities must be involved in designing these criteria. All too often, researchers tell me about assessment systems being imposed from the top down, with no consultation. This risks these new systems being no better than those they are replacing.
How to boost your research: take a sabbatical in policy
Assessments should be mission-driven and open to change over time. For example, if an institute wants to increase awareness and implementation of open science, its assessments of which researchers should be promoted could reward those who have undertaken relevant training or implemented practices such as data sharing. As open science becomes more mainstream, assessments could reduce the weight given to such practices.
The value of different research outputs will vary between fields, institutes and countries. Funding bodies in Canada, where I work, might favour grants that prioritize Indigenous engagement and perspectives in research — a key focus of diversity, equity and inclusion efforts in the Canadian scientific community. But the same will not apply in all countries.
Organizations must understand that reform can’t be done well on the cheap. They should invest in implementation scientists, who are trained to investigate the factors that stop new initiatives succeeding and find ways to overcome them. These experts can help to get input from the research community, and to bring broad perspectives together into a coherent assessment framework.
Some might argue that it would be better for cash-strapped research organizations to rework existing assessments to suit their needs rather than spend money on experts to develop a new one. Yes, sharing resources and experiences is often useful. But because each research community is unique, copying a template is unlikely to produce a useful assessment. DORA is creating tools to help. One is Reformscape (see go.nature.com/4ab8aky) — an organized database of mini case studies that highlight progress in research reform, including policies and sample CVs that can be adapted for use in fresh settings. This will allow institutions to build on existing successes.
The postdoc experience is broken. Funders such as the NIH must help to reimagine it
Crucially, implementation scientists are also well placed to audit how a new system is doing, and to make iterative changes. No research evaluation system will work perfectly at first — organizations must commit sustained resources to monitoring and improving it.
The Luxembourg National Research Fund (FNR) shows the value of this iterative approach. In 2021, it began requesting a narrative CV for funding applications, rather than a CV made up of the usual list of affiliations and publications. Since then, it has been studying how well this system works. It has had mostly positive feedback, but researchers in some fields are less satisfied, and there is evidence that institutes aren’t providing all researchers with the guidance they need to complete the narrative CV. In response, the FNR is now investigating how to adapt the CV to better serve its communities.
Each institution has its own work to do, if academia is truly to reform research assessment. Those institutions that drag their feet are sending a message that they are prepared to continue supporting a flawed system that wastes research time and investment.
Competing Interests
K.C. is the co-chair of DORA (Declaration On Research Assessment); this is an unpaid role.
The future of mobile malware is here. For the first time, cybercriminals are infiltrating iOS and Android devices and stealing user face scans. Then, armed with the power of deepfakes and AI, they’re replicating the user’s likeness to break into their bank accounts.
Yes, you read that correctly. Today’s technology allows bad actors to spoof biometric safeguards and hijack your face. This hack is as novel as it is terrifying – and it warrants immediate action from enterprises and users alike.
The arrival of deepfake hacking
This is truly a brave new world of hacking. Believed to be developed by the Chinese-speaking crime group GoldFactory, this hack uses fake apps to trick users into performing biometric verification checks. Unwittingly, users then share the facial scans required to bypass the same checks employed by legitimate banking apps in Asia Pacific.
The hackers do this by – and here’s the real innovation – using AI-powered face-swapping platforms. With biometric data in hand, as well as the ability to intercept 2FA text messages, these cybercriminals create deepfake replicas of their victims, enabling unauthorized access to their banking accounts. The result is an app scam that researchers have never seen before.
In a way, this hack reminds me of Cherryblos, another threat I wrote about in November that uses mobile malware to extract passwords and sensitive information from images. Now, it seems, hackers are turning their efforts from static images to user faces.
Unfortunately, it’s understandable why hackers are going down this route. Facial biometrics are one of the most popular mobile access methods: more than 130 million Americans use them with an app at least once a day. Up until this point, facial biometrics have been seen as a trusted alternative to passwords. The authentication method is quick, convenient, and difficult to falsify. This cunning attack shows that it’s indeed tough to crack – but not impossible.
Apu Pavithran
Founder and CEO, Hexnode.
Enterprises must fight fire with fire
This hack is currently active only in a specific region and a specific app vertical, but don’t be fooled – it’s a sign of threats to come. The entry barrier for AI and deepfake technology is low, and any hacking actor with a semblance of budget and know-how is looking at this case for inspiration. For enterprises, this means fighting fire with fire and building robust mobile malware and biometric identification protections today.
This starts with getting a grip on the apps in your ecosystem. A good way to do this is by creating a custom “store” with approved apps for corporate endpoints. Think of it like your own Play Store or App Store. Here, you can also modify permissions to regulate the level of control the app has over target devices and remove anything resembling risky behavior. It’s also vital to have strict cybersecurity criteria when inspecting which apps do or don’t make your store. If something doesn’t meet your standards, blacklist it.
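One way to picture the approved-app-store idea is as a simple allowlist audit: anything installed on a corporate device that isn’t on the approved list gets flagged for review. This is a minimal illustrative sketch only – the app IDs and the `audit_installed_apps` helper are hypothetical, and a real MDM platform would enforce this with its own policy engine rather than a script like this.

```python
# Hypothetical app allowlist for corporate endpoints.
# All app IDs here are illustrative, not real packages.
APPROVED_APPS = {
    "com.example.mail",
    "com.example.banking",
    "com.example.vpn",
}

def audit_installed_apps(installed):
    """Return the set of installed app IDs that are NOT on the allowlist."""
    return set(installed) - APPROVED_APPS

# Example device inventory: one approved app, one unknown app.
installed_on_device = [
    "com.example.mail",
    "com.sketchy.faceswap",  # not approved, so it should be flagged
]

print(audit_installed_apps(installed_on_device))
```

The design point is that the allowlist is the source of truth: anything not explicitly approved is treated as risky by default, rather than relying on a blocklist of known-bad apps.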
Next, adhere to best practices to combat mobile malware, beginning with maintaining up-to-date devices through effective patch management. Enable auto-updates, install updates promptly upon release, and automate software modifications outside of business hours. Similarly, prioritize security scans and device monitoring. Deploy a user session monitoring system to identify malware and block suspicious sessions before users share any personal data.
Finally, watch out for the telltale signs of malware infection. This includes things like device battery drain, unusual data storage, slow performance, and strange behavior. Regular audits with a unified endpoint protection software platform can help to uncover these device malfunctions. Additionally, so can another enterprise resource: employees.
Spotting and stopping social engineering attacks
In the “face” of this new threat – excuse the pun – employees are arguably the most important cybersecurity element for enterprises to get right. Why? Because social engineering is malware’s main infection avenue and this case is no different.
This hack isn’t capitalizing on Android or iOS vulnerabilities. Rather, for this facejacking malware to work, the victim must authorize relevant permissions, therefore requiring a multi-stage social engineering strategy to gain entry into the phone.
This point is worth repeating. The malware cannot rip official biometric data on Android or iOS since this information is – rightfully – encrypted and kept separate from running apps. The entire hack relies on tricking the user. Only once invited inside the device can the trojan horse then read incoming SMS messages, control background functions, and request to capture the victim’s face.
Now more than ever, users must understand how to stay safe for their own good and that of the enterprise. IT must spearhead cyber hygiene initiatives and instruct employees to avoid clicking suspicious links, use company-approved apps, and report device problems like failed software updates or irregular performance.
And, in this evermore complicated landscape of cybersecurity and AI, remind users to think twice whenever an app asks for a face scan.
This article was produced as part of TechRadarPro’s Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
● Google Podcasts account
● A smartphone
● YouTube Music installed
● Or third-party podcast platform installed
● Internet connection
On April 2, Google Podcasts will officially shut down for good in the United States, forcing its millions of users to look for a new platform. This day has been a long time coming, as the tech giant made the initial announcement of the shutdown back in September 2023.
YouTube Music will take its place, and the company has rolled out multiple updates turning it into Google’s main platform for audio content. A ‘podcast’ section was added, along with thumbnails, timestamps, and playlists. Other features include UI changes to help in the discovery of shows on the site, plus the ability to switch between audio podcasts and video seamlessly.
To help with the transition, Google is giving users a way to migrate their podcast data over to YouTube Music or supporting third-party services, although the latter can be a complicated process.
We recommend sticking with YouTube Music, since that’s the easiest transition. However, you have until July 2024 to transfer your data over to another platform, according to a Google support page. After that, you’re out of luck.
Quick steps for how to transfer your data over to YouTube Music
Go to Google Podcasts
Export subscriptions
Install YouTube Music
Transfer data
Step-by-step guide detailing how to transfer data over to YouTube Music
Launch the Google Podcasts app and tap Export Subscriptions at the top.
Select the Export button underneath the Export to YouTube Music option in the following window. Your device will then open YouTube Music on a page giving you the option to transfer data, with a brief description.
Tap the Transfer button to move forward. If you don’t have YouTube Music on your smartphone, you’ll be instructed to install the app.
(Image credit: Future)
A small box with a disclosure will appear at the bottom of the screen. Hit ‘Continue’ when it shows up. Data transferring may take several minutes to complete, depending on how many subscriptions you have.
Transferring to a third-party service
Like before, tap Export Subscriptions on Google Podcasts, but this time, select Download under the Export for Another App section. Doing so will create an ‘OPML’ file that you’ll have to upload to a third-party platform.
The process of uploading an OPML file differs from app to app. First, you’ll have to find an app that actively supports the format, such as Podcast Addict, Castbox, or Pocket Casts. We recommend the third option, as Pocket Casts arguably has the cleanest-looking interface, with an import tool that appears right after it opens.
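If you’re curious what’s actually inside that OPML file, it’s just XML listing your subscribed feeds, which you can inspect yourself before importing it anywhere. Here’s a hedged sketch using Python’s standard library – the sample OPML below is illustrative, and a real Google Podcasts export may differ slightly in structure.

```python
# Sketch: list the podcast feeds inside an OPML subscriptions file.
# The sample document is illustrative; real exports may vary.
import xml.etree.ElementTree as ET

sample_opml = """<?xml version="1.0" encoding="UTF-8"?>
<opml version="1.0">
  <head><title>Podcast subscriptions</title></head>
  <body>
    <outline text="feeds">
      <outline type="rss" text="Example Show"
               xmlUrl="https://example.com/feed.xml"/>
      <outline type="rss" text="Another Show"
               xmlUrl="https://example.org/rss"/>
    </outline>
  </body>
</opml>"""

def list_subscriptions(opml_text):
    """Return (title, feed_url) pairs for every RSS outline in the OPML."""
    root = ET.fromstring(opml_text)
    return [
        (node.get("text"), node.get("xmlUrl"))
        for node in root.iter("outline")
        if node.get("type") == "rss"
    ]

for title, url in list_subscriptions(sample_opml):
    print(f"{title}: {url}")
```

Since the file is plain XML, any podcast app that reads OPML is really just pulling out those `xmlUrl` feed addresses and re-subscribing to them – which is why the format works across so many different apps.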
(Image credit: Future)
Hitting Import at the top will take you right to your mobile device’s Downloads folder, where the OPML file can be found. From here, all you have to do is upload that file.
Again, if you want the most straightforward method, stick with the YouTube Music migration. A YouTube representative told us that Google Podcasts will be shutting down “for all global users this year,” though they couldn’t give an exact time frame for when this will occur.