Meta is expanding its paid verification service for businesses, adding three new tiers to the program that offers extra perks to companies willing to pay a monthly subscription. The company began testing the service, called Meta Verified, with businesses after rolling out a paid verification for individuals.
With the new plans, which are coming first to Australia and New Zealand, Meta is offering a much wider range of services to business owners who rely on its platform. Under the new structure, the basic “standard” plan is $14.99/month. It offers a verification badge, higher ranking in search, impersonation protection, the ability to add links to images and access to customer support. (Each subscription covers a single Facebook or Instagram account; the program is expected to roll out to WhatsApp “soon.”)
While that base plan is now the same price for businesses as it is for individuals, companies will pay a hefty premium for the extra perks. There are three additional tiers for business owners to choose from: the $44.99/month “plus” plan, the $119.99/month “premium” plan and the $349.99/month “max” plan. Each of these includes additions like the ability to add links to Reels posts, fast-tracked customer support and more profile customization options.
The most expensive plan also expands impersonation protection to up to five employees as well as extra customer service perks. It includes a semiannual “account review,” which will consist of “personalized guidance on their content strategy.” And it allows account owners to request a phone call from a Meta customer service representative for help with account issues and other problems.
During a briefing with reporters, Meta’s VP of new monetization experiences Pratiti Raychoudhury said the expansion of Meta Verified is meant “to meet businesses where they are in their journey on our apps.” She said Meta will continue to tweak its offerings as more companies sign up for verification.
Virtual desktop infrastructure (VDI) is a technology that enables businesses to use virtual machines instead of being confined to a physical workstation. The virtual machines are hosted and managed in a data center, while the users can access them remotely from their workplace.
VDI environments are hosted on a centralized server with substantial processing power. Virtual desktop images are delivered over the Internet to an endpoint device, allowing users to interact with the operating system as if they were using the device locally.
Companies use VDIs because of their flexibility and scalability. With VDI, employees can access their work desktops and applications remotely via any device. This allows people to work from anywhere instead of being confined to a physical location to access the computing power required for their corporate tasks.
How Does VDI Work?
To enable VDI, a hypervisor software segments a centralized server into different virtual machines that users can access remotely from their devices.
Let’s see a fitting example:
Company A has 100 programmers who each need sophisticated PCs to perform their work. The company has two options; buy individual PCs and send them to each programmer or use a centralized server and allow each programmer to access virtual machines, i.e., VDI technology.
Company A chooses the VDI option because it is easier to manage one server and allocate virtual machines to each programmer than to manage 100 individual PCs connected to a single corporate network. The centralized server also requires less maintenance than 100 separate PCs.
Hypervisor software divides the computing resources from the centralized server into individual virtual machines that the programmers can utilize at will. Hence, they have access to sufficient computing power to write and test code effectively. Every programmer connects to their virtual machines via a connection broker, a digital gateway acting as an intermediary between the end user and the centralized server.
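The broker-and-server relationship described above can be sketched in a few lines of code. This is a hypothetical illustration only; the class and method names are invented for the example and do not correspond to any real VDI product's API.

```python
# Illustrative sketch: a connection broker mapping users to virtual
# machines carved out of a centralized server's pooled resources.
# All names here are hypothetical, not a real VDI interface.

class CentralServer:
    def __init__(self, total_cpus):
        self.free_cpus = total_cpus
        self.vms = {}  # user -> allocated vCPU count

    def allocate_vm(self, user, cpus):
        """Carve a virtual machine out of the server's free capacity."""
        if cpus > self.free_cpus:
            raise RuntimeError("server out of capacity")
        self.free_cpus -= cpus
        self.vms[user] = cpus

class ConnectionBroker:
    """Digital gateway between the end user and the centralized server."""
    def __init__(self, server):
        self.server = server

    def connect(self, user, cpus=2):
        # Allocate a VM on first connection, then reuse it.
        if user not in self.server.vms:
            self.server.allocate_vm(user, cpus)
        return f"{user} -> VM with {self.server.vms[user]} vCPUs"

server = CentralServer(total_cpus=64)
broker = ConnectionBroker(server)
print(broker.connect("programmer_1"))  # programmer_1 -> VM with 2 vCPUs
print(server.free_cpus)                # 62
```

In a real deployment the hypervisor, not application code, does the resource carving, but the flow is the same: the broker authenticates the user, finds or creates their VM, and hands back a session.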
Persistent VDI vs. Non-Persistent VDI
There are two types of VDI: Persistent and Non-persistent.
In Persistent VDI, users have a separate desktop image, allowing them to save changes and permanently install apps. Each user’s virtual machine can have personalized settings, such as passwords, screensavers, and shortcuts.
In Non-Persistent VDI, every user gets a fresh desktop image after logging in, which expires when they log out. Users cannot have personalized settings or save files to their desktop images in this VDI type.
Persistent VDI has a 1:1 ratio, while Non-Persistent has a many:1 ratio. The former is used in organizations where employees require a separate desktop image to work effectively. In contrast, the latter is used in companies with large workforces performing repetitive tasks that don’t need a customized desktop for each worker.
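The 1:1 versus many:1 contrast can be made concrete with a toy model. In this hypothetical sketch, the persistent pool keeps one image per user across sessions, while the non-persistent pool clones a fresh "gold" image at every login; the names are illustrative only.

```python
# Toy contrast between the two VDI types: persistent keeps a dedicated
# image per user (1:1); non-persistent clones a gold image per login
# (many:1). Hypothetical example, not a real product's behavior.

GOLD_IMAGE = {"wallpaper": "default", "apps": ["browser"]}

class PersistentPool:
    def __init__(self):
        self.images = {}  # user -> their personal desktop image

    def login(self, user):
        # Reuse the user's own image, creating it on first login.
        return self.images.setdefault(user, dict(GOLD_IMAGE))

class NonPersistentPool:
    def login(self, user):
        # Every login gets a throwaway copy of the gold image.
        return dict(GOLD_IMAGE)

persistent = PersistentPool()
desk = persistent.login("alice")
desk["wallpaper"] = "beach"               # personalization survives...
assert persistent.login("alice")["wallpaper"] == "beach"

non_persistent = NonPersistentPool()
desk = non_persistent.login("bob")
desk["wallpaper"] = "beach"               # ...but not here
assert non_persistent.login("bob")["wallpaper"] == "default"
```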
VDI Use Cases
Highly-regulated sectors
Companies operating in highly regulated industries like healthcare and defense often adopt VDI technology. They do this because VDI enables them to centralize all data in a secure server and prevent malicious actors from stealing it. This technology enables such companies to easily comply with strict data secrecy regulations.
Task and shift work
VDI is an excellent fit for companies with a large workforce using the same tools to perform repetitive tasks, such as call center agents. There’s no need to issue each worker a separate device when they don’t need to save anything or use customized settings to perform their jobs. Instead, a Non-persistent VDI does the trick, allowing each worker to access a fresh virtual desktop during their shift.
Bring your own device (BYOD)
VDI technology offers an ideal solution to maintain security while allowing employees to use personal devices for work. In this case, employees use their personal devices to access a virtual desktop when it’s time to work. The virtual desktop acts as a separate machine they can’t tamper with because corporate data remains on the centralized server, not the personal devices.
Advantages of VDI
Cost-effectiveness
It’s more cost-effective for a company to operate a centralized server and allocate computing resources to each employee than to maintain separate devices for them. With VDI, employees can access their virtual desktops from older PCs and laptops, reducing the need to frequently spend money on new hardware.
Security
With VDI, your corporate data stays on the server instead of an employee’s end device. This reduces the chances of a rogue employee stealing data or your data being stolen if an employee’s device is compromised by malware.
Easy management
VDI makes it easier for IT teams to manage hardware and software resources. They only need to maintain one server rather than watching over dozens to hundreds of individual PCs. A technician can deploy any software update on the server at the click of a button, and the update automatically propagates to all virtual desktops served from it.
Remote access
With VDI, employees can connect to virtual desktops from any location. They can access all their files and applications with an internet connection, enabling your company to maintain a productive remote workforce.
VDI Drawbacks
VDI technology has some drawbacks, including:
Absolute reliance on Internet connectivity
VDI relies entirely on internet connectivity; without a fast, stable connection, there is no usable VDI session. This can pose a problem if an employee is temporarily somewhere with slow or no online access.
Complex infrastructure
VDI requires complicated infrastructure to work seamlessly. It involves many computing components working flawlessly to allow users to access virtual desktops. Any little mistake can disrupt the whole system and send your IT team into a frenzy.
Additional staffing
You may need to hire additional IT staff because of the complexity of managing large-scale virtual desktop infrastructure. VDI requires regular monitoring and updates, and you need staff to train employees about using virtual machines. Your company may also need outside consultants for the initial VDI setup.
User experience issues
Using VDIs is not as smooth as using a separate PC. Most people struggle to understand virtual machines without sufficient training, which results in a poor user experience and affects their productivity. However, you can counter this drawback by providing your staff with high-quality training and IT support, helping them get familiar with VDIs over time.
VDI vs DaaS
Businesses have two mechanisms for delivering virtual machines to their users: virtual desktop infrastructure (VDI) or desktop as a service (DaaS). But what’s the difference between the two technologies?
The difference lies in who owns and operates the desktop infrastructure. In VDI, the company creates and manages the underlying VDI servers, giving it complete control over the server settings.
In DaaS, the third-party provider owns and operates the underlying servers. The business rents the server infrastructure from the operator, meaning it doesn’t always have complete control over the configurations. Here, the company has to make do with what its DaaS provider offers.
The initial setup for a VDI is expensive because the company needs servers and other networking hardware and software to enable them to create virtual machines from the central server. On the other hand, DaaS is more affordable because the company ‘rents’ the server space from a third party that already paid for the initial setup.
VDI and DaaS have their use cases. VDI is preferable for large enterprises that can afford sizeable upfront costs to save money in the long run. DaaS is preferable for small businesses that can’t afford high setup costs and need the flexibility to spin up or dispose of computing resources at will.
In other words, large companies choose VDI as a long-term investment to get complete control of their virtual machines, while small companies choose DaaS as a short-term solution to access needed virtual machines. DaaS is basically VDI delivered as a cloud-based solution.
Implementing VDI at your Organization
Setting up VDI requires considerable planning. These are the best practices to follow when setting up this system:
Understand the end-user requirements
The first step is to get a complete picture of what the system’s end users need—it’s often not what the IT department initially thinks. What applications do they need to access? At what time of day do they use computing resources most? How many people need to connect to the centralized server at a time? Do the end users need access to high-end computing power to perform their jobs?
The above questions help you build a robust VDI for your organization. They tell you whether to choose Persistent or Non-persistent VDI and the type of server to buy. If your staff needs extensive computing power, you’ll need high-end servers to deliver this capability.
High availability
Your VDI doesn’t end at the initial setup – you must ensure that the servers can be accessed whenever needed. Any downtime hinders productivity and costs money, so you want to avoid this as much as possible. Ensure that you have enough servers to meet demand and have redundant hardware to keep up with unusual demand spikes.
Pilot testing
Run a pilot test of your VDI setup before mass deployment. Run a test for a select number of users to see how the system performs – users should provide candid feedback to the IT team so that they can fix any issues.
The pilot test is the best time to look for bugs and weaknesses in the VDI setup. You shouldn’t wait until mass deployment when little bugs can disrupt organizational workflow.
Security
Ensure your VDI has the appropriate cybersecurity solutions to protect it from malicious attacks. Security risks are high when people connect remotely from different devices. Hence, you need these cybersecurity features:
Multi-factor authentication to prevent unauthorized parties from breaching your VDI.
Audit logs to monitor who accesses the system and trace the source of any fault.
Endpoint security software to protect servers from malware and other malicious attacks.
Application restriction to control what kind of software users can access on their virtual machines.
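Two of the items above, application restriction and audit logging, can be sketched together. This is a hedged, hypothetical example: the allowlist contents and function names are invented for illustration, not drawn from any real VDI security product.

```python
# Illustrative sketch of application restriction plus audit logging:
# every launch attempt is recorded, and only allowlisted apps run.
# Allowlist entries and names are hypothetical examples.

audit_log = []
ALLOWED_APPS = {"browser", "office", "ide"}

def launch_app(user, app):
    permitted = app in ALLOWED_APPS
    # Audit log: who tried to run what, and whether it was allowed.
    audit_log.append({"user": user, "app": app, "allowed": permitted})
    if not permitted:
        raise PermissionError(f"{app} is not on the allowlist")
    return f"{app} started for {user}"

print(launch_app("alice", "browser"))     # browser started for alice
try:
    launch_app("alice", "torrent_client")
except PermissionError:
    pass
print(len(audit_log))                     # 2
```

The key design point is that the denied attempt is still logged, so the audit trail captures misuse attempts, not just successful launches.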
Continuous monitoring
You must continuously monitor your VDI servers to ensure they perform as expected. Monitoring tools give you deep insights into your servers, letting you know when anyone goes down or needs troubleshooting. The monitoring software sends alerts for any server issue so that you can react swiftly.
It’s also important to upgrade your server resources frequently to keep up with increasing demand. Always add more storage and processing equipment when needed.
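The monitoring-and-alerting loop described above reduces to a simple pattern: poll each server's health and flag any that are down. In this minimal sketch the status dictionary stands in for a real probe (a ping or an HTTP health endpoint); the server names are hypothetical.

```python
# Minimal sketch of the monitoring loop: check each VDI server's
# health and raise an alert for any that are down. The status dict
# is a stand-in for real probes (ping, HTTP health checks).

def check_servers(statuses):
    """statuses: dict of server name -> True (up) / False (down).
    Returns the list of servers that need attention."""
    return [name for name, up in statuses.items() if not up]

alerts = check_servers({"vdi-01": True, "vdi-02": False, "vdi-03": True})
for name in alerts:
    print(f"ALERT: {name} is down, troubleshooting needed")
```

Real monitoring tools add thresholds (CPU, memory, latency) and alert routing on top of this same check-and-notify loop.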
Popular VDI Providers
Here are two good examples of VDI/DaaS infrastructure providers:
Azure Virtual Desktop
Azure Virtual Desktop is a cloud-based VDI solution offered by Microsoft. With this service, you get access to powerful servers running Windows software. You can create virtual machines running Windows 10 or 11 for your employees, allowing them to work from anywhere like they’d do on a regular PC.
Despite being cloud-based, Azure gives you complete control over your server and virtual machine configurations. You only pay for the computing resources your organization uses, making it a cost-effective solution in the long term.
Microsoft Azure has one of the biggest and most resilient server infrastructures, with 300+ physical data centers worldwide. With Azure Virtual Desktop, you shouldn’t have to worry about access to the virtual desktop infrastructure needed to keep your organization productive.
VMware Horizon
VMware pioneered the modern virtual desktop infrastructure system and now offers one of the best solutions in this sector. VMware Horizon is a commercial solution that gives enterprises access to secure, high-performing virtual machines without stress.
With VMware Horizon, you can spin up or dispose of virtual machines at will. VMware’s robust infrastructure ensures that your organization can access the computing resources it needs at any time, with a centralized dashboard to manage the virtual machines. This platform gives deep insight into your VDI stack, allowing you to continuously monitor performance and make adjustments to keep the stack resilient.
Final Words
We have explained the most essential aspects of VDI: how it works, use cases, pros and cons, and best practices for implementing it. A VDI helps your organization access computing resources cost-effectively and securely. Follow our tips, and you’ll likely set up a robust VDI that end users will enjoy.
A special Today at Apple series by and for small-business owners is coming in May to select Apple Stores, the iPhone giant said Wednesday. New “Made for Business” sessions will offer small business owners and entrepreneurs free opportunities to learn how Apple products and services can support them.
“At Apple, we know small businesses are the backbone of local communities, which is why we are constantly innovating to help at every stage of their growth,” said Deirdre O’Brien, Apple’s senior vice president of Retail.
“Our retail stores provide only-at-Apple experiences such as community and education sessions, free Today at Apple programming, and ongoing support from in-store experts who help small businesses find the perfect technology to supercharge their work,” she added.
‘Made for Business’ Today at Apple sessions launch in May
The new programs start up during National Small Business Week in the United States. Six new Made for Business Today at Apple sessions will run throughout May in Chicago, Miami, New York, San Francisco and Washington, D.C. After that, programming will run in select stores worldwide throughout the year.
Here’s how Apple describes the new Made for Business Today at Apple programming:
Led by small business owners, the sessions will highlight how these organizations have used Apple products such as iPhone, iPad, and Mac — along with resources such as Apple Business Connect, Apple Business Essentials, and Tap to Pay on iPhone — to build their businesses, reach customers in new ways, and push their organizations forward.
Having small business owners lead the sessions means attendees learn directly from entrepreneurs who have used Apple products and resources to reach customers in new ways, Apple said.
New Today at Apple business sessions
An example of a business leading a session comes with Washington, D.C.-based Mozzeria, a Deaf-owned pizzeria. It was “founded with a mission to provide customers with a warm, memorable and visually captivating experience of Deaf culture,” Apple said.
Theodore Miller, the pizzeria’s director of National Operations, will lead a session at Apple Carnegie Library. He will show how the restaurant’s staff uses Apple’s accessibility features to assist clientele and build business.
“To build a truly inclusive and community-driven business, we must focus on putting people first. That means adapting our technology and practices to be more accessible. Apple’s innovations have been key in helping us boost efficiency and connect with customers,” said Miller.
“Whether it’s using Dictation on iPhone or iPad for speech to text in the Notes app, or enabling Live Captions for phone calls, Apple’s tools help bridge communication gaps and set higher standards for businesses in today’s fast-paced world,” he added.
Other resources for businesses
Apple also noted its other resources for businesses in Apple Store locations. Teams called Business Pros and Business Experts stand by to help. Here’s more information from the iPhone giant:
Whether a business owner is looking to learn which products and services are right for their team, or interested in expanding their use of Apple’s tools, Business Pros can help curate personalized solutions, facilitate easy purchasing and shipping, and help small businesses get set up with Apple resources:
Apple Business Connect, a free tool allowing businesses of all sizes to customize how they appear to more than a billion Apple users across Apple Maps, Messages, Wallet, Siri, and other apps.
With Business Connect, businesses can directly manage their information in the interactive Apple Maps place card, including creating Custom Action Links that direct users to their website or preferred platform and make it easy for customers to place orders, reserve a table, and more, right from the place card.
Apple Business Essentials, one complete subscription that seamlessly brings together device management, 24/7 support, and cloud storage. Business owners can easily manage the Apple devices in their organizations and scale up as they grow.
Tap to Pay on iPhone, which provides businesses with an easy, secure, and private way to accept in-person contactless payments. This includes contactless debit and credit cards, Apple Pay, and other digital wallets, using only an iPhone and a partner-enabled iOS app — no additional hardware or payment terminal needed. Tap to Pay on iPhone uses the built-in features of iPhone to keep business and customer data private and secure.
To get started with Today at Apple sessions, attendees can sign up at apple.com/today/groups.
Apple says it sees its Vision Pro headset as an essential addition to every business and is pushing for more enterprises to adopt the firm’s advanced wearable tech.
While spatial computing certainly has a role in business, allowing firms to create experiences that were until now impossible, or at least impractical, many firms remain unconvinced of the benefits, a view Apple wants to change.
The company says enterprise developers are making headway in several categories, from productivity and product design to immersive training and guided work. By using Vision Pro, companies can reinvent workspace environments while enhancing everyday productivity.
Enhancing everyday productivity
The tech behemoth is keen to stress the many uses its wearable tech can be put to, from allowing users to easily access and manipulate their data using popular productivity apps like Microsoft 365 and SAP Analytics Cloud, to streamlining business decision-making with real-time access to data dashboards, 3D maps, and graphics.
“There’s tremendous opportunity for businesses to reimagine what’s possible using Apple Vision Pro at work,” said Susan Prescott, Apple’s vice president of Worldwide Developer Relations and Enterprise Marketing.
“We’re thrilled to see the innovative ways organizations are already using Apple Vision Pro, from planning fire response operations to iterating on the most intricate details of an engine design – and this is just the beginning. Combined with enterprise-grade capabilities like mobile device management built into visionOS, we believe spatial computing has the potential to revolutionize industries around the world.”
Through data visualization on the Vision Pro, businesses can also explore advanced design and collaboration opportunities and aid in training procedures. Apple says guided work, which often requires a comprehensive understanding of a person’s surroundings, is another area where the Vision Pro comes in handy. The wearable technology could also provide invaluable assistance in emergency response, maintenance and repairs, utilities management, or even construction projects.
Despite the myriad benefits Apple is touting, many firms still need convincing that Vision Pro will indeed enhance everyday productivity and not merely offer a different – and perhaps less efficient – way of accomplishing standard tasks. This is the biggest challenge Apple must overcome if it is to secure broader adoption of its headset.
Imagine a world where knowledge, data, insights and breakthroughs aren’t hoarded but shared with a purpose: to tackle the monumental challenges facing our communities, societies and planet. This isn’t a utopian ideal but a technical reality and, more importantly, a moral imperative today.
Debates that were once the domain of Ivory Tower academics are front and center in the boardrooms of organizations working in energy, defense, transportation and healthcare. Stung by years of instability, volatility, global crises, conflict, and tension, companies are desperate to increase resilience, efficiency and competitiveness. There is growing evidence of individuals within enterprises keen to work cooperatively in ecosystems, who see the vastness of the challenges their businesses face and know that it is only possible to meet those challenges in partnership with others.
But enthusiastic individuals have been struggling against corporate and political inertia. In the midst of the pandemic, the World Economic Forum highlighted a lack of will at the highest levels to collaborate across geographies and boundaries. Adoption of initiatives to enable collaboration, desire for consensus, and expectation of standardization are slowed by self-interest and perceived commercial realities.
Ali Nicholl
Founding Member and Head of Engagement at IOTICS.
Growing understanding
Despite a growing understanding that sharing information leads to better outcomes for customers and companies, and usually as a result benefits people’s lives and the planet, some industries are slower than others when it comes to being open to sharing data in a mixed trust or zero trust environment. The orthodoxy says that the solution is to convince the majority of the rightness of a course of action, and get agreement on how to act, and then to implement change.
That isn’t how you change the world. The sad reality is that in a global market, slow and steady doesn’t win the race. I’d argue it never has, but it certainly won’t if the racetrack is crumbling around you. Look at the healthcare industry, for example. Organizations here are dealing with more problems, in more complex ways, at greater expense than ever before. But these problems are approached with 20th-century budgets and ideas. As a result, many healthcare systems are at breaking point.
Greater demands
In every aspect of our life we are seeing greater and greater demands on our resources, personal, societal and governmental. One of the challenges that the utilities sector faces is accurately identifying and supporting customers living with vulnerabilities. Despite mandates from regulators around data sharing, progress has been slow and many vulnerable customers face potentially dangerous situations. Utilities companies hold information on customers that, if available securely across an ecosystem, could allow emergency help to be directed to those who need it most in the event of a storm or flood. Sharing data could also help provide tailored extra support by being more flexible around payments.
In defense, despite the heightened geopolitical risk, encapsulated by the notion we are now on a pre-war footing, the UK armed forces are facing an equipment funding shortfall of £17bn over the next 10 years. Civil and military aerospace and industrial manufacturers are confronting supply chain disruption, spiraling prices, and skills shortages. We simply do not have the runway for business as usual. Siloed effort and, worse, repeated reinvention of effort are wasting resources at every level of the supply chain.
Technology-enabled cooperation can inspire and accelerate societal change for a thriving planet. New technologies, approaches, and capabilities are being developed by brilliant individuals and organizations all the time, but working together and leveraging expertise from across sectors, geographies and organizations can provide sustainable solutions. The automotive industry, for example, is investing in data spaces to enable interoperable supply chains from rare earth materials to on-the-road innovation, embedding resilience, increasing adaptability and enhancing the value of its individual expertise and specialisms.
Data has become our most valuable commodity, but it is often siloed within organizations due to technical, regulatory or commercial barriers. Concerns around intellectual property (IP), commercial assets and secrets, and cybersecurity are held up as barriers to working and thriving in ecosystems. Technologies now exist that can mitigate these risks, enabling data sovereignty and secure cooperation without exposure of commercially sensitive or individually identifiable information.
Exploring different models
Despite these innovations, reticence remains, and there is still an unwillingness to explore different models and new approaches. I passionately believe that to delay further is immoral. We know it is possible to address climate change, increase climate resilience, improve social inequality, protect the most vulnerable in our society, and ensure that no one is left behind as we transition to new technologies, energy sources and models; to not act is now a choice. To hide behind initiatives that are little more than talking shops, sustainability approaches that don’t address the underlying wastefulness, and roadmaps for change that won’t see impact for decades, if at all, is a dereliction of leadership.
It is also a disservice to shareholders. I accept the commercial realities we all work within. So let’s set aside the moral argument and speak to selfish altruism. A recent KPMG report stated: ‘Every day, organizations deal with increasing amounts of data, originating from both inside and outside the organization. Due to centralized data ownership and data access, transparency and timeliness have become bottlenecks in many organizations.’ Those bottlenecks cost money; they represent an inability to bring capabilities to market, to seize opportunities and to maximize profits. Inability to cooperate in ecosystems limits export markets, diminishes resilience, and prevents the innovation and adaptation necessary to capitalize on new technologies.
Unwillingness to grasp the imperative of rapidly exploring, leveraging and exchanging data, knowledge and systems across ecosystems, is hurting your businesses. It is limiting the upside of your strategy and leaving you vulnerable to the symptoms and consequences of the generational challenges that we face. There might be a moral imperative for data sharing. But there’s a commercial one too.
This article was produced as part of TechRadarPro’s Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
Businesses across different industries are increasingly recognizing the potential of AI to revolutionize their operations, create new products and services, and gain a competitive edge. However, no business is the same and navigating the complexities of AI adoption can be daunting. Here are five essential steps businesses can take to successfully integrate AI into their operations in 2024 and unlock its transformative potential.
Be mindful of the AI hype – focus on business needs and goals…and whether AI can deliver today or not.
Experimenting is great, but only with purpose. Before diving headfirst into the world of AI, it’s crucial to ask yourself “What challenges does my business face? What opportunities can AI address?” Your purpose could be enhancing customer experience, streamlining internal processes, or optimizing decision-making, for example.
Clearly defined goals provide a roadmap for your AI journey. It also helps you avoid the tempting trap of adopting AI for the sake of it. Focus on specific needs and align your AI initiatives with your overall business strategy.
Keep in mind, as well, that AI is still in its formative stage – it can’t do everything we hope for today. It is imperative to gain a solid upfront understanding of AI’s capabilities, mapped against your objectives, to avoid investing in something that, ultimately, isn’t possible just yet.
Edward Funnekotter
Chief Architect and AI Officer, Solace.
Quality and quantity – build a data-driven foundation
The quality and accessibility of your data directly impacts the effectiveness and accuracy of your AI models. This is where robust data governance practices and integration solutions come into play. Data silos are the enemy of AI. They stop AI from being able to learn, evolve and provide meaningful and valuable insights for your business. Breaking down silos requires organizations to invest in data governance practices and data integration solutions as a priority. Another important aspect is implementing tools that ensure clean, consistent, and readily available data for your AI applications. Quality data is the fuel that powers quality AI.
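A concrete starting point for the data-quality practices described above is a simple validation gate that rejects incomplete or inconsistent records before they reach an AI pipeline. This is a hedged sketch: the field names and rules are hypothetical examples, not a prescribed schema.

```python
# Illustrative data-quality gate: reject rows with missing or
# inconsistent fields before they feed an AI model. Field names
# and validation rules are hypothetical examples.

REQUIRED = ("customer_id", "timestamp", "amount")

def clean(records):
    """Split records into valid rows and rejected rows."""
    ok, rejected = [], []
    for row in records:
        has_fields = all(row.get(f) is not None for f in REQUIRED)
        if has_fields and row["amount"] >= 0:
            ok.append(row)
        else:
            rejected.append(row)
    return ok, rejected

ok, rejected = clean([
    {"customer_id": 1, "timestamp": "2024-01-01", "amount": 9.99},
    {"customer_id": 2, "timestamp": None, "amount": 5.00},        # missing field
    {"customer_id": 3, "timestamp": "2024-01-02", "amount": -1},  # bad value
])
print(len(ok), len(rejected))  # 1 2
```

Production data-governance tooling layers schema enforcement, lineage tracking and monitoring on top of exactly this kind of accept/reject decision.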
Embrace actual real-time technology
Traditional data architectures often struggle to keep pace with the real-time demands of AI. This is why it’s essential to embrace event mesh technology, a proven approach to distributed networks that enables real-time data sharing and processing. By adopting event-driven architecture (EDA), businesses unlock a new realm of real-time AI, allowing systems to react quickly to events, trigger automated actions, and make decisions based on the latest information.
Sign up to the TechRadar Pro newsletter to get all the top news, opinion, features and guidance your business needs to succeed!
This approach to AI helps businesses deliver more personalized experiences such as real-time recommendations, offers, and support based on individual needs. It could also allow for predictive maintenance, anticipating problems or failures ahead of time to allocate the correct resources to the right place at the right time. With EDA as a central nervous system for your data, not only is it possible for AI to operate in real-time, but adding new AI agents becomes significantly easier.
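To make the publish/subscribe pattern behind EDA concrete, here is a minimal in-process sketch in Python. It is purely illustrative – the class, topic names and telemetry fields are our own inventions, not any event mesh product's API – but it shows the core idea: subscribers react the moment an event is published, rather than polling for changes.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-process event bus illustrating the publish/subscribe
    pattern at the heart of event-driven architecture (EDA)."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Every subscriber reacts as soon as the event arrives,
        # rather than polling a database for changes.
        for handler in self._subscribers[topic]:
            handler(event)

# Example: flag a machine for predictive maintenance the moment
# a high-temperature reading is published.
bus = EventBus()
flagged = []
bus.subscribe(
    "machine.telemetry",
    lambda e: flagged.append(e["id"]) if e["temp"] > 90 else None,
)
bus.publish("machine.telemetry", {"id": "m-42", "temp": 95})
print(flagged)
```

In a production event mesh the bus would span processes and sites, but the contract is the same: producers and consumers are decoupled by topics, so new AI agents can be added simply by subscribing.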
Engineer your bespoke AI platform
Developing AI applications can be slow, hindering their potential value. Platform engineering can act as the much-needed accelerator. This emerging discipline aims to modernize enterprise software delivery, particularly for digital transformation, while optimizing the developer experience and accelerating product teams’ delivery of customer value. These platforms give developers access to automated IT infrastructure management, pre-configured tools, and pre-built components, letting them focus on what truly matters: building innovative AI solutions faster.
The overarching focus is on streamlining infrastructure, automating tasks, and providing pre-built components for developers. Such AI applications will remain hypothetical, however, without the ability to design and develop them in the first place – which is why it is promising that Gartner sees platform engineering coming of age in 2024.
Break the shackles of legacy systems
Amid the rush to real-time and AI-driven operations, large, disparate organizations will still be limited in their ability to achieve optimal business value because of their reliance on a complex mix of legacy and/or siloed systems. Last year, IDC found that only 12% of organizations connect customer data across departments.
The AI data rush will drive a greater industry-wide need for event-driven integration, but only with an enterprise architecture pattern will systems new and old be able to work together. Without it, you won’t be able to offer seamless, real-time digital experiences that link events across departments, locations, on-premises systems, IoT devices, and cloud or even multi-cloud environments.
AI adoption must not chase the latest trends but focus on making strategic investments that deliver tangible business value. By prioritizing your business and following the steps listed above, you can unlock the transformative potential of AI and propel your business forward in the data-driven era. Remember, AI is a powerful tool, but its success hinges on careful planning, strategic implementation, and a clear understanding of your business goals.
This article was produced as part of TechRadarPro’s Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
Billed as a commercially safe AI image generator, iStock released its AI photo platform back in January 2024 – and during a live demo of the latest update, we spoke to Chief Product Officer Grant Farhall and Bill Bon, Director of Editing, about creative efficiencies, business-first AI, and what makes a good AI text-to-image prompt.
Famed for its stock media library, the company, owned by Getty Images, has been focused of late on creating a good, usable AI tool that’s accessible at pretty much every level of an organization. And a commercially safe one, too, untrained on copyrighted materials that might bring down unnecessary lawsuits on businesses big and small.
The latest additions to the toolkit, Refine and Expand, add an extra layer of creativity. We took a look at how they worked while discussing iStock’s AI future.
Finding creative efficiencies
Business may be the order of the day, but Farhall told us that iStock has made sure the tool was fun to use. Fun. It’s a word that’s repeated often during the live demo, feeling like a byword for simple, accessible, engaging. And it’s hard to deny that using certain AI art generators can feel like a slog, especially professional tools with their unintuitive or overly complex interfaces. In contrast, iStock is trying to balance exciting capabilities with usability – a factor the company feels gets overlooked in the rush for AI-everything. “It’s all about making lives easier,” said Farhall, “Letting users create in an elevated way.”
That’s evidenced from the start, with the platform employing a helpful prompt builder that guides you through the key information needed to generate AI images. Better prompts mean better results.
Refine and Expand, the platform’s latest tools, are another example of this. The former lets users highlight areas of AI-generated images, then prompt the tool to add objects and people, or tweak elements of the image. In one example, we’re shown how easy it is to add a scarf to a photorealistic image of a man, just by selecting the neck area and telling the AI what to add. In another, a tiger was added to a jungle scene.
Expand, a tool favored by designer Bon, intelligently adds to the background: streets become longer and more populated, woods grow thicker – and, Bon explained, those expansions should make contextual sense. The advantages for time-poor designers and non-designers on social media, where every platform demands different image dimensions, are clear.
The focus is on the ability to “expand what’s possible”, said Farhall, “finding creative efficiencies”. With AI, anything is possible.
Well, almost anything.
It’s rare in a live demo that we’re intentionally shown the limitations of a product. Here, what the tool can’t do is announced with the sort of fanfare usually reserved for massive updates, new features. The iStock AI Generator is, effectively, a safe – commercially safe – sandbox. In part the goal, said Farhall, was to prove you don’t need a mass dataset culled from every corner of the internet to train an AI; that it’s possible to provide a transparent process where creators are compensated and businesses can feel confident using the platform.
The AI generator can’t, for instance, generate existing logos, celebrities or copyrighted images. Nike swooshes, Taylor Swift, Mickey Mouse, the golden arches of McDonald’s – as far as the iStock AI is concerned, these things simply do not exist. They’re not part of the training data, and they’ll never be generated. Nor will a company’s AI generations be cycled back into the model. It’s a creative silo for businesses.
Since unveiling its tool, iStock has gone hard on legal business protections. From the beginning, it offered the same $10K legal coverage enjoyed by users of its pre-shot media library. The company hopes this will allow for seamless use between its AI and the stock media platforms.
That’s essential, Farhall explained, since the real issue with AI is that it’s always “looking back.” He cited the pandemic as an example of a moment when photographers could be commissioned to capture current images. Users, then, should get the best of both worlds – and expect closer integration between the two, too. We’re told the ability to use the AI tool to edit and refine pre-shot content is in the pipeline, and in May users will be able to fine-tune AI images to create branded generations.
Before wrapping up, we asked veteran prompt writer Bon the secret behind a good text-to-image prompt. Experimentation, he explained – but a good AI text prompt places the important details at the start. Make it long, 30-40 words, using cinematic language, like whimsical and spooky, and include specifics such as subjects and lighting style. And use image filters to make photoreal images more filmic. So, now we know.
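Bon's advice can be sketched as a tiny prompt-assembly helper. This is our own illustration of the guidance above – the function and field names are hypothetical, not iStock's actual prompt-builder API: key details first, then mood and lighting specifics.

```python
# Illustrative prompt builder following the advice above:
# important details (the subject) first, then setting, mood
# words and lighting specifics. Names here are hypothetical.
def build_prompt(subject: str, setting: str, mood: str,
                 lighting: str, extras=None) -> str:
    parts = [subject, setting, mood, lighting] + list(extras or [])
    return ", ".join(parts)

prompt = build_prompt(
    subject="a lone hiker on a mountain ridge",
    setting="misty alpine forest at dawn",
    mood="whimsical, slightly spooky atmosphere",
    lighting="soft golden backlight, cinematic shallow depth of field",
    extras=["35mm film grain"],
)
print(prompt)
```

The resulting string lands in the 30-40 word range Bon recommends once you flesh out each field, and keeps the subject up front where generators weight it most.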
With macroeconomic headwinds persisting in the wake of cutbacks for many UK businesses, it’s clear that the pressure on companies to save money is not going away. But organizations must be wary of the temptation to reduce investment in data technology and analysis, as they risk losing a crucial competitive advantage. With data analysis and artificial intelligence (AI) growing in importance, almost half of businesses (44%) plan to push through data modernization efforts in 2024, according to PwC. Organizations therefore cannot afford to turn their backs on technologies which can deliver key business advantages, such as improved customer experiences and enhanced product innovations.
In the year ahead, the organizations that will be most effective at navigating the economic landscape will be those that focus on managing spend and increasing efficiency to drive better business outcomes. According to IDC, the world is producing more data than ever – as much as 181 zettabytes per year by 2025, the capacity of 45 trillion DVDs. Especially with the boom of generative AI, data will continue to be a key differentiator for those looking to capitalize on AI: the more diverse and comprehensive the data, the better AI can perform. For businesses to remain competitive, harnessing the power of data insights, along with effective cost management and planning, must be front of mind for business leaders.
James Hall
UK Country Manager, Snowflake.
Business value and transparency
Achieving transparency on existing costs is the first step towards becoming data efficient. For data admins – those responsible for processing data into a convenient data model – this means using their analytical skills to scrutinize existing workloads, allowing them to identify which are actually delivering valuable insights. From this point, they can take a view on whether to re-architect a workload, increase or decrease its usage, or even retire those which are not delivering results. A full understanding of data lineage, including where data comes from and what happens to it, can also be a useful starting point to help establish cost controls, as well as pinpointing costly errors.
Transparency on spend must also come from the SaaS vendor and platform a business selects. This enables businesses to understand what they are investing in each workload and weigh it against return on investment. Understanding per-query costs can highlight the most expensive queries and allow admins or IT leaders to rethink them, rewriting or refactoring where needed. Increased visibility and control of spend will give businesses the best chance of maximizing existing resources.
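The per-query analysis described above can be sketched in a few lines. This is a generic illustration, not any platform's actual billing API – the record fields are hypothetical – but it captures the key point: the most expensive query is not always the one with the highest unit cost, because total spend is unit cost multiplied by run count.

```python
# Hypothetical per-query cost records; field names are illustrative.
queries = [
    {"id": "q1", "cost_usd": 0.04, "runs": 5000},  # cheap but very frequent
    {"id": "q2", "cost_usd": 1.20, "runs": 300},   # expensive and regular
    {"id": "q3", "cost_usd": 0.50, "runs": 40},    # expensive but rare
]

# Total spend per query = unit cost x number of runs.
by_total = sorted(queries,
                  key=lambda q: q["cost_usd"] * q["runs"],
                  reverse=True)

# The top entries are the rewrite/refactor candidates.
for q in by_total:
    print(q["id"], round(q["cost_usd"] * q["runs"], 2))
```

Here the mid-priced q2 dominates total spend, so it is the first candidate for refactoring – exactly the kind of insight per-query transparency is meant to surface.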
Predicting future costs
Only when businesses get a handle on their data costs can they truly begin to predict future spend and implement measures to keep spending as efficient as possible. Many legacy data platforms are highly inflexible, with fixed-cost pricing and long-term vendor lock-in contracts, making it harder to implement changes when times are tough, or even to scale back requirements during quieter periods of data analysis. Such tools often require complex, time-consuming capacity planning in order to keep control of data costs, which can ironically prove expensive in itself.
The costs of data processing, monitoring and control mechanisms cannot be an afterthought. Flexible scaling and consumption-based pricing models are a great way of avoiding unnecessary overprovisioning and paying for processing and storage that does not deliver for the business. A growing number of organizations are also choosing to set up budgets in advance, with spending limits, digital ‘guards’ against overspending, and daily alert notifications and warnings. This allows businesses to pinpoint where money is being spent, how much value it is delivering, and how it can be reined in.
Modern data platforms built in the cloud provide an intuitive UI to examine usage and usage trends, with clear dashboards visualizing which teams, customers and cost centers are responsible for the bulk of spending. Rather than waiting for spending to go over budget, companies can get ahead of the game and see when spending limits are projected to be exceeded. In the long run, this will help technical leaders and CFOs reduce operational costs through more efficient usage.
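The projection logic behind such dashboards is straightforward. Below is a minimal sketch – our own hypothetical function, not a vendor feature – that extrapolates month-end spend from the run rate so far and flags a projected overrun before the budget is actually breached.

```python
# Hypothetical spend "guard": thresholds and messages are illustrative.
def check_budget(spent_to_date: float, monthly_budget: float,
                 day_of_month: int, days_in_month: int = 30) -> str:
    """Project month-end spend from the average daily run rate and
    compare it with the budget, warning before the limit is hit."""
    daily_rate = spent_to_date / day_of_month
    projected = daily_rate * days_in_month
    if spent_to_date >= monthly_budget:
        return "over budget: suspend non-critical workloads"
    if projected > monthly_budget:
        return f"warning: projected ${projected:,.0f} exceeds budget"
    return "on track"

# Halfway through the month, 60% of the budget is already gone:
print(check_budget(spent_to_date=6000, monthly_budget=10000,
                   day_of_month=15))
```

Real platforms refine this with per-team attribution and seasonality, but the principle is the same: act on the projection, not the invoice.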
Tracking usage at a granular level — think account level, per user, or per task — will be a key differentiator. However, larger companies should also contemplate taking control at an organizational level. This can mean restricting the ability of teams or individuals to create credit-consuming resources, such as warehouses. Such capabilities also offer in-depth control over factors such as the size and number of clusters, and granular control over when clusters are spun up, helping to control costs now and in the future. Per-job cost attribution helps organizations manage department costs and maximize resources as they scale to more teams and jobs. Furthermore, auto-suspend and auto-resume capabilities can be enabled by default, turning platforms off when they aren’t required and preventing customers from paying for unnecessary usage.
Harnessing data, controlling costs
Even in tough economic times, organizations should not abandon ambitions to harness the power of data. For businesses in any sector, analyzing and understanding data has never been more important. The focus must instead shift towards changes that actually deliver results, such as moving from legacy on-premises platforms to modern SaaS data platforms that enable better transparency and planning on costs.
Doing so will have a massive impact and empower businesses to take control of their tech investments, which can be a key differentiator in today’s challenging macroeconomic landscape. Businesses should avoid taking the self-defeating, retrograde path of cutting back on their data usage, and should embrace the potential of modern data platforms to maximize cost efficiencies and control, while still forging a path into a data-driven future.
The world is witnessing a significant shift in its labor market due to the increasing adoption of artificial intelligence, new research has claimed.
A report from LinkedIn has detailed a 9% surge in hiring AI technical talent over the past year in the UK, signaling a growing reliance on the emerging technology that has now become commonplace in many industries.
The shift is reflected across hiring trends and evolving skill sets, with studies suggesting that the skills needed for jobs in the UK specifically could change by as much as 65% by 2030, compared with 2016.
AI is transforming the labor market
The top AI skills now in demand across the UK include AI, machine learning, generative AI, deep learning and natural language processing, according to LinkedIn’s study. These skills are increasingly sought after across a range of sectors, with administrative and support services, professional services, technology, information and media, manufacturing, and financial services all asking for them.
Janine Chamberlin, LinkedIn’s UK Country Manager, emphasized the necessity for companies to prioritize upskilling efforts: “Businesses that invest in helping their employees become AI-literate will benefit from productivity gains and have an edge over competitors.”
Chamberlin recommends testing publicly available tools like Microsoft Copilot and ChatGPT as a stepping stone to deploying more advanced, more customized AI apps.
At the same time, a separate Rackspace Technology study carried out in collaboration with AWS found that three in four (75%) IT decision-makers plan on investing between $500,000 and $5 million in AI this year. According to that report, between 36% and 48% credit AI with boosting their sales.
The report’s findings align with those of LinkedIn: according to Rackspace, 85% of respondents attempted to recruit people with AI/ML skills in the past year. The predicted boost in investment also comes at the right time, with LinkedIn finding that seven in 10 UK hiring managers expect the skills gap to widen in the next five years.
Chamberlin summarized: “Those that [actively upskill employees] will help their workforce stay agile and build skills needed for the future of work.”
The surge in demand for large-scale generative AI models has led to a significant increase in hardware requirements, making model training costly and inaccessible for many SMBs and educational establishments.
High-performance custom PC builder Maingear has partnered with storage giant Phison on a new range of Maingear Pro AI workstations that boast powerful Intel Xeon W7-3455 CPUs.
The new workstations can be configured with up to 1TB of DDR5 memory, and up to 4x RTX 5000 Ada or 4x RTX 6000 Ada GPUs. These GPUs are supported by Phison aiDAPTIV+ caching SSDs and software, to significantly lower the cost of LLM development and training.
Off-the-shelf components
Maingear Pro AI workstations fit in a standard desktop tower PC design so they can easily be stored under a desk, or placed anywhere in an office (a 4U rackmount chassis is also available). Maingear says these workstations have been designed with off-the-shelf components for easy upgrades and Noctua cooling components to manage heat and reduce noise when under load.
Maingear founder and CEO, Wallace Santos, stated, “Our dedication to crafting highly capable yet budget-friendly solutions guarantees SMBs, universities, and research facilities a competitive advantage in an industry formerly restricted by multimillion-dollar investments.”