Google Cloud today announced Transfer Service, a new service for enterprises that want to move their data from on-premises systems to the cloud. This new managed service is meant for large-scale transfers on the scale of billions of files and petabytes of data. It complements similar services from Google that allow you to ship data to its data centers via a hardware appliance and FedEx, or to automate data transfers from SaaS applications to Google’s BigQuery service.
Transfer Service handles all of the hard work of validating your data’s integrity as it moves to the cloud. The agent automatically handles failures and uses as much available bandwidth as it can to reduce transfer times.
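Google hasn’t published how the agent validates integrity, but the general technique is to compare checksums computed before and after the transfer. A minimal, generic sketch in Python, reading in chunks so large files never need to fit in memory:

```python
import hashlib
import io

def md5_digest(stream, chunk_size: int = 1 << 20) -> str:
    """Compute the MD5 hex digest of a binary stream, one chunk at a time."""
    h = hashlib.md5()
    for chunk in iter(lambda: stream.read(chunk_size), b""):
        h.update(chunk)
    return h.hexdigest()

# Comparing the digest computed on-premises with the digest reported for
# the uploaded object is one way to confirm a file arrived intact.
local_digest = md5_digest(io.BytesIO(b"example file contents"))
```

This is an illustrative sketch only; the actual hash algorithm and retry logic the agent uses are not documented in the announcement.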
To get started, all you have to do is install an agent on your on-premises servers, select the directories you want to copy, and let the service do its job. You can then monitor and manage your transfer jobs from the Google Cloud console.
The obvious use case for this is archiving and disaster recovery. But Google is also targeting companies that are looking to lift and shift workloads (and their attached data), as well as for analytics and machine learning use cases.
As with most of Google Cloud’s recent product launches, the focus here is squarely on enterprise customers. Google wants to make it easier for them to move their workloads to its cloud, and for most workloads, that also involves moving lots of data.
Chalk up another win for European data protection: Microsoft has announced changes to commercial cloud contracts following privacy concerns raised by European Union data protection authorities.
The changes to contractual terms will apply globally and to all of its commercial customers, whether a public or private sector entity, or a large or small business, it said today.
The new contractual provisions will be offered to all public sector and enterprise customers at the beginning of 2020, it adds.
In October Europe’s data protection supervisor warned that preliminary results of an investigation into contractual terms for Microsoft’s cloud services had raised serious concerns about compliance with EU data protection rules and the role of the tech giant as a data processor for EU institutions.
Writing on its EU Policy blog, Julie Brill, Microsoft’s corporate VP for global privacy and regulatory affairs and chief privacy officer, announces the update to privacy provisions in the Online Services Terms (OST) of its commercial cloud contracts — saying it’s making the changes as a result of “feedback we’ve heard from our customers”.
“The changes we are making will provide more transparency for our customers over data processing in the Microsoft cloud,” she writes.
She also says the changes reflect those Microsoft developed in consultation with the Dutch Ministry of Justice and Security — which comprised both amended contractual terms and technical safeguards and settings — after the latter carried out risk assessments of Microsoft’s OST earlier this year and also raised concerns.
Specifically, Microsoft is accepting greater data protection responsibilities for additional processing involved in providing enterprise services, such as account management and financial reporting, per Brill:
Through the OST update we are announcing today we will increase our data protection responsibilities for a subset of processing that Microsoft engages in when we provide enterprise services. In the OST update, we will clarify that Microsoft assumes the role of data controller when we process data for specified administrative and operational purposes incident to providing the cloud services covered by this contractual framework, such as Azure, Office 365, Dynamics and Intune. This subset of data processing serves administrative or operational purposes such as account management; financial reporting; combatting cyberattacks on any Microsoft product or service; and complying with our legal obligations.
Microsoft currently designates itself as a data processor, rather than a data controller, for these administrative and operational functions that can be linked to the provision of commercial cloud services, such as its Azure platform.
But under Europe’s General Data Protection Regulation (GDPR), a data controller has the widest obligations around handling personal data, with responsibility under Article 5 for the lawfulness, fairness and security of the data being processed, and therefore also greater legal risk should it fail to meet that standard.
So, from a regulatory point of view, Microsoft’s current commercial contract structure poses a risk for EU institutions of user data ending up being processed under a lower standard of legal protection than is merited.
The announced switch from data processor to data controller for these administrative and operational purposes should raise the bar of legal protection around that processing for commercial customers of its cloud services.
For delivery of the core cloud services themselves, Microsoft says it will remain the data processor, as it will for improving and addressing bugs or other issues related to the service, ensuring the security of the services, and keeping the services up to date.
In August a conference organized jointly by the EU’s data protection supervisor and the Dutch Ministry brought together EU customers of cloud giants to work on a joint response to regulatory risks related to cloud software provision.
Earlier this year the Dutch Ministry obtained contractual changes and technical safeguards and settings in the amended contracts it agreed with Microsoft.
“The only substantive differences in the updated terms [that will roll out globally for all commercial cloud customers] relate to customer-specific changes requested by the Dutch MOJ, which had to be adapted for the broader global customer base,” Brill writes now.
Microsoft’s blog post also points to other global privacy-related changes it says were made following feedback from the Dutch MOJ and others — including a rollout of new privacy tools across major services; specific changes to Office 365 ProPlus; and increased transparency regarding the use of diagnostic data.
The .NET Foundation is an independent organization run by a board of directors made up of fellow .NET community members. The foundation is working to foster innovation with .NET and support open source projects. The .NET Foundation is critical for enabling the community, and we are enthusiastic to be joining the foundation to help advance the community. AWS understands how important it is to not simply join the foundation but to embrace and advance the foundation’s goals. We take our responsibility to the community seriously. AWS will continue to listen to the customer, increase investments, and make contributions to the community.
A few words from Norm Johanson, AWS’ representative to the .NET Foundation:
Fred and I have a long history building scalable, high-performance systems. I’m excited to represent AWS in the .NET Foundation. Joining the .NET Foundation as a Corporate Sponsor is an important moment in AWS’s journey – a journey that extends back to 2008.
I remember when .NET 1.0 was first coming out; I was so excited for the future of what could be built. I joined AWS in 2010 to lead the .NET development effort at AWS. AWS understood the importance of .NET, and the AWS SDK for .NET was actually the first unified AWS SDK to be released by AWS in November of 2009. Since then I have had the chance to work with countless members of the .NET community through GitHub, Twitter, and speaking at conferences to help them build on AWS, and to get a better understanding of what .NET developers expect from AWS.
During the time .NET Core was in early development, we knew this was the future of .NET. We quickly started building support for .NET Core and were able to add support during the early betas of .NET Core. We also released the first AWS Lambda .NET Core runtime for serverless applications just a few months after .NET Core 1.0 was released. Like the .NET Foundation, we know it is important to do this work in the open, so the AWS .NET team works openly on GitHub at github.com/aws/dotnet with the libraries and tools we build.
A few words from Fred Wurden, General Manager of AWS Windows and Enterprise:
Many people don’t realize how long ago we started and how high our commitment level has been across AWS to deliver a great .NET experience for our customers. Norm is the perfect representative from AWS to guide and represent the effort of many engineers across AWS in supporting the .NET Foundation. Our customers have deep and long-standing investments in .NET. They tell us they expect AWS to fully support their journey to modernize systems, databases, and applications as they transform their businesses on the cloud. Our sponsorship of the .NET Foundation is an important step to extend our commitment to .NET customers.
AWS will continue to listen to the customer, increase investments, and make contributions to the community. We look forward to working with the .NET Foundation, the .NET community at large, and our .NET customers to support the future of .NET.
Microsoft today took Azure Sentinel out of public preview and into general availability, making it an official Azure service. With Azure Sentinel, Microsoft has now officially entered the SIEM market.
SIEM stands for security information and event management, and refers to a type of software used by cyber-security teams.
SIEM products can be cloud-based systems or locally-running apps. They work by gathering information from different sources, such as OS, application, antivirus, database, or server logs, and analyzing these large quantities of data for anomalies or signs of a security incident.
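At its simplest, the anomaly detection a SIEM performs amounts to aggregating events by source and flagging outliers. A minimal, hypothetical sketch in Python (the field names and threshold are illustrative, not Azure Sentinel’s API):

```python
from collections import Counter

def flag_anomalies(events, key="src_ip", event_type="auth_failure", threshold=50):
    """Flag sources generating an unusual volume of a given event type.

    `events` is an iterable of dicts such as {"type": "auth_failure",
    "src_ip": "10.0.0.5"}; returns {source: count} for sources at or
    above the threshold.
    """
    counts = Counter(e[key] for e in events if e.get("type") == event_type)
    return {src: n for src, n in counts.items() if n >= threshold}
```

Real SIEM products layer correlation rules, machine learning, and threat intelligence on top of this kind of aggregation, but the needle-in-a-haystack principle is the same.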
Because of their ability to spot the needle in the haystack, SIEM products have become widely adopted in enterprise networks, where cyber-security departments need to keep an eye on hundreds, if not thousands, of threat indicators.
Microsoft’s new Azure Sentinel service works in the same manner, except it’s also deeply integrated with Microsoft’s cloud services, such as Office 365 and the other Azure offerings, making it a go-to solution for companies running on Azure-first infrastructure.
Nonetheless, Azure Sentinel also supports importing data from a large number of third-party software solutions, and will also handle importing from any custom data streams in the Common Event Format (CEF).
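CEF is a pipe-delimited text format with a fixed seven-field header followed by key=value extensions. A simplified parser sketch in Python, which handles only plain records (real CEF allows backslash-escaped pipes and spaces inside values, which this deliberately ignores):

```python
def parse_cef(line: str) -> dict:
    """Parse a simple CEF record into header fields plus an extension dict.

    Header layout: CEF:Version|Vendor|Product|Version|SignatureID|Name|Severity|Extension
    """
    if not line.startswith("CEF:"):
        raise ValueError("not a CEF record")
    parts = line.split("|", 7)
    header_keys = ["cef_version", "device_vendor", "device_product",
                   "device_version", "signature_id", "name", "severity"]
    record = dict(zip(header_keys, parts[:7]))
    record["cef_version"] = record["cef_version"][len("CEF:"):]
    # Extension: space-separated key=value pairs (unescaped values assumed)
    extension = {}
    if len(parts) == 8:
        for pair in parts[7].split():
            if "=" in pair:
                k, v = pair.split("=", 1)
                extension[k] = v
    record["extension"] = extension
    return record
```

This is not how Azure Sentinel itself ingests CEF, just an illustration of what the format looks like on the wire.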
Seems like everything is going to the cloud these days, so why should movie-making be left out? Today, Walt Disney Studios announced a five-year partnership with Microsoft around an innovation lab to find ways to shift content production to the Azure cloud.
The project involves the Walt Disney StudioLAB, an innovation workspace where Disney personnel can experiment with moving different workflows to the cloud. Movie production software company Avid is also involved.
The hope is that by working together, the three parties can come up with creative, cloud-based workflows that can accelerate the innovation cycle at the prestigious movie maker. Every big company is looking for ways to innovate, regardless of their core business, and Disney is no different.
As movie making involves ever greater amounts of computing resources, the cloud is a perfect model for it, allowing studios to scale resources up and down as needed, whether rendering scenes or adding special effects. As Disney’s CTO Jamie Voris sees it, this could make these processes more efficient, which could help lower cost and time to production.
“Through this innovative partnership with Microsoft, we’re able to streamline many of our processes so our talented filmmakers can focus on what they do best,” Voris said in a statement. It’s the same kind of cloud value proposition that many large organizations are seeking. They want to speed time to market while letting technology handle some of the more mundane tasks.
The partnership builds on an existing one that Microsoft already had with Avid, where the two companies have been working together to build cloud-based workflows for the film industry using Avid software solutions on Azure. Disney will add its unique requirements to the mix, and over the five years of the partnership, hopes to streamline some of its workflows in a more modern cloud context.
At IBC 2019, Microsoft announced new updates to Azure Media Services including the popular Video Indexer. Azure Media Services Video Indexer allows you to search for videos by person, object, visual text, spoken word, entity, or emotion. It can automatically extract insights and metadata from videos. With the new update, Video Indexer now supports animated character recognition and multilingual speech transcription.
Video Indexer now supports a new set of models that automatically detect and group animated characters and allow customers to then tag and recognize them easily via integrated custom vision models.
New automatic spoken language identification for multi-language content leverages machine learning technology to identify the different languages used in a media asset. Once detected, each language segment undergoes an automatic transcription process in the identified language, and all segments are then integrated back into a single transcription file spanning multiple languages.
Brand detection capabilities have also been improved to incorporate well-known names and locations, such as the Eiffel Tower in Paris or Big Ben in London.
A new shot-type detection feature adds a set of “tags” to the metadata attached to an individual shot in the insights JSON, representing its editorial type (such as wide shot, medium shot, close up, extreme close up, two shot, multiple people, outdoor, and indoor).
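To make the idea concrete, here is a hypothetical insights payload and a helper that filters shots by tag. The JSON structure below is illustrative only; the actual Video Indexer schema may differ:

```python
import json

# Hypothetical insights payload; field names are assumptions for illustration.
insights_json = """
{
  "shots": [
    {"id": 1, "tags": ["wide shot", "outdoor"]},
    {"id": 2, "tags": ["close up", "indoor"]},
    {"id": 3, "tags": ["two shot", "indoor"]}
  ]
}
"""

def shots_with_tag(insights, tag):
    """Return the ids of shots whose editorial tags include the given tag."""
    return [s["id"] for s in insights["shots"] if tag in s.get("tags", [])]

insights = json.loads(insights_json)
```

Tags like these let an editor query, say, every indoor close-up across a catalog without scrubbing through footage.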
You can learn about other updates from the source link below.
Staying connected to access and ingest data in today’s highly distributed application environments is paramount for any enterprise. Many businesses need to operate in and across highly unpredictable and challenging conditions. For example, energy, farming, mining, and shipping often need to operate in remote, rural, or other isolated locations with poor network connectivity.
With the cloud now the de facto and primary target for the bulk of application and infrastructure migrations, access from remote and rural locations becomes even more important. The path to realizing the value of the cloud starts with a hybrid environment that accesses cloud resources over dedicated and private connectivity.
Network performance for these hybrid scenarios from rural and remote sites becomes increasingly critical. Globally connected organizations, the explosive number of connected devices and volume of data in the cloud, emerging areas such as autonomous driving, and traditional remote settings such as cruise ships are all directly affected by connectivity performance. Other examples requiring highly available, fast, and predictable network service include managing supply chain systems from remote farms or transferring data to optimize equipment maintenance in aerospace.
Today, I want to share the progress we have made to help customers address and solve these issues. Satellite connectivity addresses challenges of operating in remote locations.
Microsoft cloud services can be accessed with Azure ExpressRoute using satellite connectivity. With commercial satellite constellations becoming widely available, new solution architectures offer improved and affordable performance for accessing Microsoft cloud services.
Microsoft Azure ExpressRoute, with one of the largest networking ecosystems in the public cloud, now includes satellite connectivity partners, bringing new options and coverage.
SES will provide dedicated, private network connectivity from any vessel, airplane, enterprise, energy or government site in the world to the Microsoft Azure cloud platform via its unique multi-orbit satellite systems. As an ExpressRoute partner, SES will provide global reach and fibre-like high-performance to Azure customers via its complete portfolio of Geostationary Earth Orbit (GEO) satellites, Medium Earth Orbit (MEO) O3b constellation, global gateway network, and core terrestrial network infrastructure around the world.
Intelsat’s customers are the global telecommunications service providers and multinational enterprises that rely on our services to power businesses and communities wherever their needs take them. Now they have a powerful new tool in their solutions toolkit. With the ability to rapidly expand the reach of cloud-based enterprises, accelerate customer adoption of cloud services, and deliver additional resiliency to existing cloud-connected networks, the benefits of cloud services are no longer limited to only a subset of users and geographies. Intelsat is excited to bring our global reach and reliability to this partnership with Microsoft, providing the connectivity that is essential to delivering on the expectations and promises of the cloud.
Viasat, a provider of high-speed, high-quality satellite broadband solutions to businesses and commercial entities around the world, is introducing Direct Cloud Connect service to give customers expanded options for accessing enterprise-grade cloud services. Azure ExpressRoute will be the first cloud service offered to enable customers to optimize their network infrastructure and cloud investments through a secure, dedicated network connection to Azure’s intelligent cloud services.
Microsoft wants to help accelerate these scenarios by optimizing connectivity through Microsoft’s global network, one of the largest and most innovative in the world.
ExpressRoute for satellites directly connects our partners’ ground stations to our global network using a dedicated private link. But what, more specifically, does this mean for our customers?
Using satellite connectivity with ExpressRoute provides dedicated and highly available, private access directly to Azure and Azure Government clouds.
ExpressRoute provides predictable latency through well-connected ground stations and, as always, keeps all traffic private on our network, with no traversal of the public internet.
Customers and partners can harness Microsoft’s global network to rapidly deliver data to where it’s needed or augment routing to best optimize for their specific need.
With some of the world’s leading broadband satellite providers as partners, customers can select the best solution based on their needs. Each partner brings different strengths, for example a choice between Geostationary (GEO), Medium Earth Orbit (MEO) and, in the future, Low Earth Orbit (LEO) satellites, as well as geographical presence, pricing, technology differentiation, and bandwidth.
ExpressRoute over satellite creates new channels and reach for satellite broadband providers among a growing base of enterprise, organization, and public sector customers.
With this addition to the ExpressRoute partner ecosystem, Azure customers in industries like aviation, oil and gas, government, peacekeeping, and remote manufacturing can deploy new use cases and projects that increase the value of their cloud investments and strategy.
As always, we are very interested in your feedback and suggestions as we continue to enhance our networking services, so I encourage you to share your experiences and suggestions with us.
Microsoft Authenticator is an app designed to help users sign into their accounts using two-factor authentication. It can enable passwordless sign-in; respond to a prompt for authentication after signing-in with username/password; or act as a code generator for any other accounts supporting authenticator apps.
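The code-generator role mentioned above is standardized as TOTP (RFC 6238): a time-based one-time password derived from a shared secret via HMAC. A minimal generic sketch in Python, which is not Microsoft’s implementation, just the open algorithm such apps implement:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """RFC 6238 time-based one-time password (SHA-1, 30-second time steps)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                         # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because any account that publishes a base32 secret (the QR code you scan) uses this same scheme, Authenticator can generate codes for third-party services as well as Microsoft accounts.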
Microsoft has been rolling out the app’s cloud backup feature gradually over the past few weeks. As of today, September 12, “it’s now 100% available for version 6.6.0+,” Microsoft’s blog post says. Credentials will remain updated even when users add, delete or edit accounts, officials said.
To turn on cloud backup, Authenticator users can go to settings and then, under “Backup,” set the cloud backup toggle to on. To recover account credentials on a new device, users can select “Begin Recovery” as an account option to be able to sign in using the same Microsoft account as was on their previous devices.
Microsoft Authenticator is available for iOS and Android devices. On iOS, users must have an iCloud account for the storage location. Both Android and iOS users need a personal Microsoft account to act as their recovery accounts. Only users’ personal and third-party account credentials are stored by Authenticator, meaning the username and the account verification code required to prove identity. No other account information is stored, Microsoft officials say.