AT&T resets thousands of user passwords as it confirms breached data was its own after all

American telecommunications behemoth AT&T has finally confirmed the authenticity of the 2021 data breach that spilled sensitive user information on the dark web, and has initiated a mass reset of user passcodes.

Roughly three years ago, privacy blog RestorePrivacy broke the news of a hacker selling sensitive data belonging to more than 70 million AT&T customers. The data allegedly contained people’s names, phone numbers, postal addresses, email addresses, Social Security numbers, and dates of birth.

Source Article Link

Leveraging big data for strategic business decisions

Organizations today heavily rely on big data to drive decision-making and strategize for the future, adapting to an ever-expanding array of data sources, both internal and external. This reliance extends to a variety of tools used to harness this data effectively.

In the modern business environment, with an estimated 2.5 quintillion bytes of data generated daily, big data is undoubtedly pivotal to understanding and advancing every aspect of an organization’s goals. However, its sheer volume and speed of collection can overwhelm organizations and lead to analysis paralysis if it is not managed and analyzed objectively. Dissected thoughtfully, it can provide the critical insights necessary for strategic advancement.

The evolution of big data in business strategy

Source Article Link

Navigating the shift to AI-driven data management

In today’s dynamic business landscape, data management stands as a critical cornerstone, directly influencing an organization’s agility and innovation capabilities. The digital age demands that companies reassess their data management strategies, particularly the reliance on traditional master data management (MDM) systems. These legacy systems, often entrenched due to the ‘sunk cost’ fallacy, hinder progress and adaptability, locking businesses into outdated practices that impede growth.

Rules-based MDM solutions, with their rigid frameworks and manual-intensive operations, are increasingly misaligned with the needs of modern data environments. They struggle to manage the diversity and volume of data generated today, leading to inefficiencies that can ripple through an organization, affecting everything from decision-making speeds to customer experience and the ability to capitalize on emerging opportunities.

Source Article Link

Why bigger data sets don’t mean better insights

‘Data is the new oil’ is a phrase coined by British mathematician Clive Humby in 2006. It has since become an overused shorthand for the idea that if your organization has access to vast amounts of data, you can use it to aid decision making and drive results.

While it is true that access to data can lead to greater business intelligence insights, what companies actually need is access to ‘good’ data and the insights it yields. However, knowing what makes data valuable is something many still struggle with. Considerations often include quantity, age, source and variety, but without a true understanding of what type of data is good for the business, it is easy to get lost in data sets that are ultimately poor quality and bad for decision making.

The big cost of the wrong big data

Source Article Link

VR headsets could be hacked in “Inception-esque” attacks — with attackers able to steal your data without you even noticing

If someone were to infect your Meta Quest VR headset with malware, they could trick you into seeing things in the virtual world that aren’t real, experts have warned.

Academics from Cornell University recently published a paper describing the possibility of hijacking people’s VR sessions and controlling their interactions with internal applications, external servers, and more. 

Source Article Link

The NSA Warns That US Adversaries Free to Mine Private Data May Have an AI Edge

Electrical engineer Gilbert Herrera was appointed research director of the US National Security Agency in late 2021, just as an AI revolution was brewing inside the US tech industry.

The NSA, sometimes jokingly said to stand for No Such Agency, has long hired top math and computer science talent. Its technical leaders have been early and avid users of advanced computing and AI. And yet when Herrera spoke with me by phone about the implications of the latest AI boom from NSA headquarters in Fort Meade, Maryland, it seemed that, like many others, the agency had been stunned by the recent success of the large language models behind ChatGPT and other hit AI products. The conversation has been lightly edited for clarity and length.

Gilbert Herrera (Photograph: Courtesy of National Security Agency)

How big of a surprise was the ChatGPT moment to the NSA?

Oh, I thought your first question was going to be “what did the NSA learn from the Ark of the Covenant?” That’s been a recurring one since about 1939. I’d love to tell you, but I can’t.

What I think everybody learned from the ChatGPT moment is that if you throw enough data and enough computing resources at AI, these emergent properties appear.

The NSA really views artificial intelligence as the frontier of a long history of using automation and computing to perform our missions. AI has long been viewed as a way we could operate smarter, faster, and at scale. And so we’ve been involved in research leading to this moment for well over 20 years.

Large language models were around long before generative pretrained transformer (GPT) models. But this “ChatGPT moment”—once you could ask it to write a joke, or engage in a conversation—is what really differentiates it from other work that we and others have done.

The NSA and its counterparts among US allies have occasionally developed important technologies before anyone else but kept them secret, like public key cryptography in the 1970s. Did the same thing perhaps happen with large language models?

At the NSA we couldn’t have created these big transformer models, because we could not use the data. We cannot use US citizens’ data. Another thing is the budget. I listened to a podcast where someone shared a Microsoft earnings call, and they said they were spending $10 billion a quarter on platform costs. [The total US intelligence budget in 2023 was $100 billion.]

It really has to be people that have enough money for capital investment that is tens of billions and [who] have access to the kind of data that can produce these emergent properties. And so it really is the hyperscalers [largest cloud companies] and potentially governments that don’t care about personal privacy, don’t have to follow personal privacy laws, and don’t have an issue with stealing data. And I’ll leave it to your imagination as to who that may be.

Doesn’t that put the NSA—and the United States—at a disadvantage in intelligence gathering and processing?

I’ll push back a little bit: It doesn’t put us at a big disadvantage. We kind of need to work around it, and I’ll come to that.

It’s not a huge disadvantage for our responsibility, which is dealing with nation-state targets. If you look at other applications, it may make it more difficult for some of our colleagues that deal with domestic intelligence. But the intelligence community is going to need to find a path to using commercial language models and respecting privacy and personal liberties. [The NSA is prohibited from collecting domestic intelligence, although multiple whistleblowers have warned that it does scoop up US data.]

Source Article Link

Data loss affected four out of five organizations in 2023

More than four out of five organizations around the world (85%) suffered at least one data loss incident last year.

This is according to a new report from cybersecurity researchers Proofpoint, which says that most of the time, it’s not the computers’ fault – it’s ours.

Source Article Link

AT&T denies leaked data of 70 million people is from its systems

A hacker is selling a huge archive on the dark web, claiming it originated from a 2021 data breach at American telecommunications giant AT&T – however, the company denies the data came from its systems.

BleepingComputer reported that a threat actor using the alias ShinyHunters posted an ad on the RaidForums hacking forum offering sensitive data belonging to 71 million AT&T customers for sale.

Source Article Link

‘Algae recycled into energy’: How one of Europe’s largest data center firms wants to harness waste heat from GPUs and other hardware to grow sustainable marine flora — but will it be worth it?

Data centers produce a lot of waste heat that could one day be recycled and used to heat millions of homes.

Now, French data center company Data4 has partnered with the University of Paris-Saclay to launch a project that aims to use data center heat to grow algae, which can then be recycled into energy. The pilot project, set to commence early in 2024, will be trialed in the Paris region.

Source Article Link

Newly discovered Microsoft Z1000 SSD baffles experts — no, the world’s most valuable company won’t start selling SSDs anytime soon, but it may well be tinkering with data center storage as it did with CPUs

A newly discovered, Microsoft-branded SSD suggests the tech giant may be – or at least has been – exploring new ways to optimize its data center storage.

The leaked images of a Microsoft Z1000 SSD show a 1TB NVMe M.2 drive, apparently boasting sequential read speeds of up to 2,400MB/s and write speeds of 1,800MB/s.

The Z1000 SSD, originally revealed by @yuuki_ans on X, is made up of a mix of components from various companies, including Toshiba NAND flash chips, Micron’s DDR4 RAM cache, and a controller from CNEX Labs, a company best known for its work with data center hyperscalers.

Microsoft Z1000 SSD (Image credit: @yuuki_ans on X)

Up to 4TB capacity



Source Article Link