Nokia signs multiyear deal to migrate its data center infrastructure to Google Cloud

Networking equipment provider Nokia announced that it has signed a multiyear agreement to use Google as its cloud infrastructure provider. Nokia said it will migrate its global data centers and servers, as well as various software applications, onto Google Cloud infrastructure over an 18- to 24-month period.

Nokia said the deal reflects the company’s operational shift toward a cloud-first IT strategy. The cloud move is also meant to help Nokia manage its digital operations and expand collaboration capabilities for its employees working remotely amid the pandemic.

Under the deal, Nokia will use a suite of Google Cloud products and services, with its infrastructure and applications running in the public cloud or delivered via a SaaS model. The companies have also worked out a customized migration schedule that will allow Nokia to exit its data centers more quickly. Google Cloud will deploy strategic systems integrators, solutions specialists, and engineers to ensure a stable migration, the companies said.

“Nokia is on a digital transformation path that is about fundamentally changing how we operate and do business,” said Ravi Parmasad, VP of Global IT Infrastructure at Nokia. “This is crucial for how our employees collaborate so that we continue to raise the bar on meeting the needs of our customers. Given Nokia’s digital ambitions and plans, this is an ideal time for Nokia to be taking this step with Google Cloud to accelerate our efforts; and doing all of this in a secure and scalable way.”

GE Healthcare Introduces New Edge Technology Designed to Give Clinicians Rapid Access to Critical Data

Business Wire

CHICAGO -- October 14, 2020

GE Healthcare today introduced Edison HealthLink, a new edge computing
technology designed specifically for the needs of healthcare providers that
allows clinicians to collect, analyze and act upon critical data closer to its
source. With 10 applications already available through Edison HealthLink –
including TrueFidelity image reconstruction, Mural Virtual Care and CT Smart
Subscription – the solution gives healthcare providers another entry point
into the Edison ecosystem.

While COVID-19 has accelerated the adoption of cloud technology, analyzing
data from a distance can pose various risks to operational efficiency and
patient care. Concerns around bandwidth, network and latency challenges remain
when a matter of seconds could determine the outcome for a patient. For
example, time is critical when diagnosing and treating stroke: around 2 million
brain cells die every minute until blood flow is restored, increasing the
risk of permanent damage. Running advanced post-processing software at the
edge to evaluate brain scans allows clinicians to analyze and act upon
critical data without sending it to the cloud, enabling rapid decision making.
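The latency argument can be made concrete with the article's own figure. A back-of-the-envelope sketch (the delay values below are illustrative assumptions, not measurements from GE):

```python
# Rough estimate of what processing delay costs during a stroke,
# using the ~2 million neurons/minute figure cited above.
NEURONS_LOST_PER_MINUTE = 2_000_000

def neurons_lost(delay_minutes):
    """Approximate brain cells lost while a scan waits to be processed."""
    return int(NEURONS_LOST_PER_MINUTE * delay_minutes)

# Assumed, illustrative delays: round-tripping a CT study to the cloud
# versus running post-processing on an on-premise edge node.
cloud_delay_min = 5.0   # upload + queue + download (assumption)
edge_delay_min = 0.5    # local processing only (assumption)

saved = neurons_lost(cloud_delay_min) - neurons_lost(edge_delay_min)
print(f"Edge processing saves roughly {saved:,} neurons per study")
```

Under these assumed delays, shaving four and a half minutes off the pipeline is the difference of roughly nine million brain cells, which is the whole case for keeping stroke post-processing at the edge.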

“COVID-19 has accelerated industry-wide trends with implications for the
future of care delivery. It’s time to apply these trends and use them to
modernize the current health system infrastructure,” said Amit Phadnis, Chief
Digital Officer, GE Healthcare. “As more care delivery becomes virtual and as
more healthcare data moves to the cloud, technologies like Edison HealthLink
provide a bridge, allowing devices to operate on premise, at the edge and in
the cloud.”

By operating medical devices that connect to Edison HealthLink, health systems
can continually receive advanced software updates without requiring new
equipment – essentially extending the life of existing assets.

Built using GE’s deep healthcare domain expertise, Edison HealthLink 

Google unveils revamped Google Analytics with new ML models, more granular data controls

Google is rolling out what it says is the biggest overhaul of Google Analytics in nearly a decade. The revamped platform features new machine learning capabilities, unified app and web reporting, native integrations and privacy updates.

With the redesign, Google said it’s aiming to provide a more modern approach to data analytics and measurement. 

Building on the foundation of the App + Web property that Google introduced in beta last year, the new Google Analytics has machine learning models baked in, along with new integrations between Analytics and Google Ads, and new controls to help customers better manage their data.

The new machine learning models can automatically alert customers to significant trends in their data, like calculating churn or purchase probability. The new property type also includes unified measurement to remove data fragmentation across devices and platforms, and more granular data controls for things like ads personalization and activity sharing. 
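Google does not publish the internals of these predictive models, but the idea behind a churn- or purchase-probability score can be sketched with a toy logistic model. All feature names and weights below are made-up illustrations, not Analytics internals:

```python
import math

def churn_probability(days_since_last_visit, sessions_last_30d, purchases_last_30d):
    """Toy logistic model: inactivity raises churn risk, engagement lowers it.
    Weights and features are invented for illustration only."""
    z = (0.15 * days_since_last_visit
         - 0.40 * sessions_last_30d
         - 1.20 * purchases_last_30d
         - 0.50)                       # bias term
    return 1.0 / (1.0 + math.exp(-z))  # squash score into a 0..1 probability

# A lapsed user scores far higher than a regularly purchasing one.
inactive = churn_probability(days_since_last_visit=25,
                             sessions_last_30d=1, purchases_last_30d=0)
engaged = churn_probability(days_since_last_visit=1,
                            sessions_last_30d=12, purchases_last_30d=3)
```

In practice a production system would learn such weights from historical behavior data rather than hand-setting them; the point is only that the model's output is a per-user probability that can trigger alerts or audience segments.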

Google also said the new Google Analytics 4 is privacy-centric by design, with privacy-safe measurement that it plans to build out over time as the technology landscape evolves. 

“The new Analytics is designed to adapt to a future with or without cookies or identifiers,” said Vidhya Srinivasan, VP of Measurement, Analytics, and Buying Platforms at Google, in a blog post. “It uses a flexible approach to measurement, and in the future, will include modeling to fill in the gaps where the data may be incomplete. This means that you can rely on Google Analytics to help you measure your marketing results and meet customer needs now as you navigate the recovery and as you face uncertainty in the future.”

Google Analytics has been the industry-standard web analytics tool since 2005. The standard version is free to sign up for and use, which is ideal

Tufts University to Add New Online Master’s in Data Science and Post-Baccalaureate in Computer Science

MEDFORD/SOMERVILLE, Mass., Oct. 14, 2020 /PRNewswire/ — Tufts University School of Engineering is collaborating with Noodle Partners, a leading online program manager (OPM), to launch a new online Master of Science in Data Science program and a Post-Baccalaureate in Computer Science. The programs are expected to launch in January 2021 with classes beginning in Fall 2021. 

“We are laser focused on building online programs that help meet the growing demand for data and computer scientists.”

The Master of Science program in Data Science is designed to prepare students who have earned bachelor’s degrees in STEM fields for advanced careers in data analysis and data-intensive science. The program focuses on statistics and machine learning, with courses in data infrastructure and systems, data analysis and interfaces, and theoretical elements. 

The Post-Baccalaureate program in Computer Science is open to individuals with at least a bachelor’s degree in any discipline (BA or BS) and one college-level introductory computer course. The program is particularly well-suited for individuals preparing to re-enter the workforce, mid-level professionals looking to move into the field of computer science, and those preparing for graduate school. 

The Department of Computer Science and the Department of Electrical and Computer Engineering jointly administer the Master of Science in Data Science, while the Department of Computer Science offers the Post-Baccalaureate in Computer Science. Students may apply to the post-baccalaureate program or to the post-baccalaureate/master’s combined program in Computer Science. 

“Building on the success of our recently launched Master of Science in Computer Science program with Noodle last fall, these two new programs in Data Science and Computer Science will help meet the soaring global demand for data engineers and computer scientists,” said Jianmin Qu, Dean of the Tufts University School of Engineering and Karol Family Professor. “In this fast-changing learning landscape,

Technology products supplier Intcomex hacked, 1TB of data stolen

Technology products supplier Intcomex Corp. has suffered a data breach and about a terabyte of its user data was released on a hacking forum.

First reported by Cybernews today, the leaked data included credit card details, passport numbers, license scans, personally identifiable information, payroll data, financial documents, customer details, employee information and more.

Parts of the data were first released for free on a Russian hacking forum Sept. 14, with more released Sept. 20. Those behind the hack are promising to release even more data in the future.

Intcomex hasn’t formally disclosed the data breach on its website, but the company did confirm the hack to Cybernews. In a statement running through the standard responses, Intcomex said it had taken steps to address the situation, had “engaged third-party cybersecurity experts to assist us in the investigation and… implemented additional enhanced security measures. We also notified law enforcement. We are notifying affected parties as appropriate.”

The company is based in Miami, Florida, but offers services both in the U.S. and abroad. Though it’s not clear whether Intcomex has customers in California, which has some of the strictest breach-disclosure requirements in the U.S., it’s arguably poor form not to disclose the details of a data breach publicly regardless of local legal requirements to do so.

“The bottom line is no company or industry is immune to cyberattack,” Adam Laub, general manager of data access governance firm Stealthbits Technologies Inc., told SiliconANGLE. “While it seems more of an inevitability than anything else at this point, the probability of successful breach and compromise at tremendous scale like this is really what organizations are somewhat in control of.”

Erich Kron, security awareness advocate at security awareness training firm KnowBe4 Inc., noted that not only is the volume of leaked data significant but the sensitivity of the contents