Breaking

Monday, April 30, 2018

4/30/2018 10:31:00 PM

iPhone 8, 8 Plus Product Red edition launched starting at Rs 67,940 in India

India is a part of the third wave of iPhone 8 Red Edition launches.


We already saw the iPhone dressed in red last year with the iPhone 7 and 7 Plus, and the trend continues with Apple announcing the iPhone 8 and 8 Plus Product Red Edition. The company's red variants contribute towards raising money to fight HIV/AIDS.

The iPhone 8 Product Red edition with 64GB storage will cost Rs 67,940, whereas the 256GB variant will cost Rs 81,500. The iPhone 8 Plus Product Red edition, on the other hand, will cost Rs 77,560 for the base 64GB variant, and the 256GB model will cost Rs 91,110.

Both the devices go on sale today and can be found on the official Apple India website. 

The new models of the Product Red portfolio were launched internationally starting April 13. Countries like Australia, Canada, China, France, Germany, South Korea, the UK and the US already have the iPhone 8 and 8 Plus Red editions available to them. In fact, India is among the third wave of launches.

Though not the pioneer of bringing the color red to phones, Apple was definitely the first to design an entire marketing campaign around the color. After Apple's launch, brands like Oppo, Vivo, and OnePlus followed suit.

That being said, it's for a good cause. All the money that Apple receives from the sale of Red Edition phones goes towards Red, the global fund for HIV/AIDS grants. More specifically, Red is an organization that seeks to engage private sector companies in raising funds and increasing awareness about the HIV/AIDS epidemic in eight African nations.

As per reports from December, Apple had raised over $160 million (Rs 1,038 crore) for Red's efforts. So far, it is the organization's largest donor.

  
4/30/2018 09:34:00 PM

Ubuntu 18.04 LTS: The Linux for AI, clouds, and containers

Ubuntu will still live on as a desktop operating system, but that's not where Canonical sees it as having its greatest potential.


Even in 2018, if you ask most people what they know about Ubuntu, they'll tell you it's a desktop Linux. Oh, but there's so, so much more to Canonical's Ubuntu than that, and in its latest long-term support (LTS) release, Ubuntu 18.04 LTS, that really shows up.

In a conference call interview, Mark Shuttleworth, Canonical's CEO and Ubuntu's founder, said, "Most public cloud instances -- Azure, AWS, Oracle, and so on -- are Ubuntu. To better support Ubuntu, 18.04 features improvements in network and storage and improved boot time optimization so that Ubuntu instances can ramp up faster with demand. In addition, Canonical has been working with NVIDIA to improve its public cloud General Purpose GPU (GPGPU) support."

Shuttleworth added: "Multi-cloud operations are the new normal. Boot-time and performance-optimized images of Ubuntu 18.04 LTS on every major public cloud make it the fastest and most efficient OS for cloud computing, especially for storage and compute-intensive tasks like machine learning."

NVIDIA GPGPU hardware acceleration is built into Ubuntu 18.04 LTS cloud images and Canonical's OpenStack and Kubernetes distributions for on-premise bare metal operations, supporting Kubeflow, and machine learning (ML) and AI workflows.

This has led Canonical to work closely with Google, IBM, and NVIDIA to improve Ubuntu's ML support. Shuttleworth specifically mentioned its work with Google on Kubeflow as an example. This is a new open-source project dedicated to making the use of ML stacks on Kubernetes easy, fast, and extensible.

Kubeflow, the Google approach to TensorFlow on Kubernetes, and a range of CI/CD tools are integrated in Canonical Kubernetes and aligned with Google GKE for on-premise and on-cloud AI development.

"Having an OS that is tuned for advanced workloads such as AI and ML is critical to a high velocity team" added David Aronchick, product manager of Cloud AI at Google. "With the release of Ubuntu 18.04 LTS and Canonical's collaborations to the Kubeflow project, Canonical has provided both a familiar and highly performant operating system that works everywhere. Whether on-premise or in the cloud, software engineers, and data scientists can use tools they are already familiar ... and greatly accelerate their ability to deliver value to their customers."

Canonical is also continuing to support the OpenStack Infrastructure-as-a-Service (IaaS) cloud.

Shuttleworth claimed Canonical OpenStack delivers private cloud with significant savings over VMware with a modern, developer-friendly Application Programming Interface (API). With built-in support for NFV and GPGPUs, the Canonical OpenStack offering has become a reference cloud for digital transformation workloads. Today, Ubuntu is at the heart of the world's largest OpenStack clouds, both public and private, in key sectors such as finance, media, retail, and telecommunications.

Shuttleworth slammed VMware: "VMware is expensive. OpenStack is replacing it."

In addition, Canonical's Distribution of Kubernetes (CDK) runs on public clouds, VMware, OpenStack, and bare metal. It delivers the latest upstream version, currently Kubernetes 1.10. Many Canonical partners deliver their solutions on CDK, such as Rancher 2.0, a popular container management program.

Shuttleworth continued, "We think Kubernetes is a commodity. Our pure Kubernetes is delivered as a free service on top of VMs. It's the simplest and most cost-efficient solution, and with it, you can scale Kubernetes from desktop to rack and out to the public cloud."

This is in contrast, he said, to Red Hat, which has integrated Kubernetes into OpenShift, Red Hat's Platform-as-a-Service (PaaS). This is true: with the release of Red Hat Enterprise Linux (RHEL) 7.5, Red Hat announced, "With Red Hat Enterprise Linux 7.5, we're ... completely deprecating the Kubernetes RPMs and container image." If you want to use Kubernetes in production, Red Hat says, "We recommend that customers evaluate Red Hat OpenShift for a fully supported container platform based on Kubernetes."

With Ubuntu 18.04, CDK also supports GPGPU acceleration of workloads using the NVIDIA DevicePlugin. Applications built and tested with Kubeflow and CDK are perfectly transportable to Google Cloud.

Developers on Ubuntu can create applications on their workstations, test them on private bare-metal Kubernetes with CDK, and run them across vast data sets on Google's GKE. The resulting models and inference engines can be delivered to Ubuntu devices at the edge of the network, creating a perfect pipeline for machine learning from workstation to rack to cloud and device.

Canonical is also using LXD, its container hypervisor, to 'lift and shift' legacy workloads into containers for performance and density. LXD does this by providing 'machine containers,' which behave like virtual machines. They can contain a full Linux guest operating system such as Ubuntu, RHEL, or CentOS. This provides a traditional administration environment for legacy applications, which run, Canonical claims, at bare metal speeds with no hypervisor latency.

As Shuttleworth explained, "Enterprises are realizing legacy apps aren't comfortable in the Kubernetes world. LXD, a system container that behaves like a VM, lends itself well for traditional applications." He added there are "tools available to move legacy apps from hardware and virtual machines to LXD. For example, you can run older RHEL instances without the Intel Meltdown and Spectre patches on LXD on Ubuntu 18.04, with the patches." Shuttleworth concluded, "80 percent of legacy apps can run in LXD. It all depends on the app's need for a specific kernel. If they don't need a specific Linux kernel, they can run in LXD."
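
As a rough illustration of that 'lift and shift' workflow, here is a minimal sketch that drives the lxc command line from Python. It assumes LXD is already installed and initialized on the host; the container name, image alias and application paths are illustrative, not anything Canonical ships.

```python
#!/usr/bin/env python3
"""Minimal sketch: wrap the lxc CLI to move a legacy app into a machine
container. Assumes LXD is installed and initialised (`lxd init`); the
container name, image and paths below are illustrative placeholders."""
import subprocess

def lxc(*args):
    # Run an lxc subcommand and fail loudly if it errors.
    subprocess.run(["lxc", *args], check=True)

# 1. Launch a machine container with a full guest userspace (Ubuntu 16.04 here).
lxc("launch", "ubuntu:16.04", "legacy-app")

# 2. Copy the legacy application into the container (source path is hypothetical).
lxc("file", "push", "-r", "./legacy-app-dist", "legacy-app/opt/")

# 3. Start it inside the container, just as you would on a VM or bare metal.
lxc("exec", "legacy-app", "--", "/opt/legacy-app-dist/start.sh")
```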

Thinking of Legacy with a capital L applications, Ubuntu 18.04 also supports IBM mainframes. "Canonical and IBM have been working closely together to offer cloud solutions with Ubuntu on IBM LinuxONE and IBM Z," said Michael Desens, IBM's VP of Offering Management, IBM Z, and LinuxONE.

Don't let all this talk about clouds and Kubernetes fool you, though. Canonical is still supporting the desktop. The new Ubuntu 18.04 LTS comes with a default GNOME desktop as a replacement for its Unity desktop. It also natively supports the KDE, MATE, and Budgie desktops.

Shuttleworth also boasted of the growing popularity of its snap package manager system. Snaps, which started in Ubuntu Touch, are now a Linux distro-agnostic upstream software delivery system. Canonical claims there are more than 3,000 snaps published and millions installed, including official releases from Spotify, Skype, Slack, and Firefox. With snaps, publishers can deliver software updates directly and security is maintained with enhanced kernel isolation and system service mediation.

"Snaps enables us to access more Linux users and opens the market for us to accommodate more distributions," said Jonáš Tajrych, senior software engineer of Microsoft's Skype. Snaps reduced, he said, "the complexity and time of maintaining several packages across multiple distributions. In addition, we want our users to consistently experience the latest and greatest version of Skype and the automatic update feature allows us to seamlessly deliver this to them. It's such a promising format and an asset for developers to help create unification."

If you want a "just the basics" Linux desktop, Canonical is also providing a new minimal desktop. This provides only the core desktop and browser. In businesses, the minimal desktop can serve as a base for custom desktop images with just the applications you want and a smaller attack surface.

And, continuing on with desktops, Ubuntu will run better than ever... with the Windows desktop.

New Hyper-V-optimised images developed in collaboration with Microsoft enhance the virtual machine experience of Ubuntu in Windows. "In our upcoming OS release this spring, Hyper-V's Quick Create VM Gallery will now include an image for the latest Ubuntu 18.04 LTS, officially stamped straight from Canonical," said Craig Wilhite, program manager at Microsoft. "This Ubuntu VM image will come pre-configured to offer clipboard functionality, drive redirection, dynamic resizing of VM console window, and much more, as we look to provide a great Hyper-V client VM experience for Linux on Windows."

So, while many of you may still be using the Ubuntu desktop, Canonical is making it very clear that Ubuntu has a larger role to play on clouds with containers and even with an old rival: Microsoft.


4/30/2018 07:39:00 PM

Microsoft's Build 2018 session highlights: Microsoft 365, Graph interface, MSIX packaging, more

Microsoft's early-May Build developer conference is coming into further focus as the company reveals more (but still not all) of the sessions it's planning for the three-day event.


On April 25, Microsoft posted titles and abstracts of hundreds of its sessions for its three-day Build 2018 developer conference, which kicks off on May 7. (To see the expanded session list, registered users must log into the Build web site.) In late March, Microsoft posted a couple of dozen of the Build 2018 sessions, with a focus on IoT, blockchain and data science.

While every attendee has her/his own areas of focus, I found a few of the items now listed to be of interest.

Day 1 starts with a 2.5-hour "vision" keynote anchored by Microsoft CEO Satya Nadella, followed immediately by another 1.5-hour technology keynote on the intelligent cloud and edge, headlined by Executive Vice President Scott Guthrie.

There will be a kick-off keynote on Day 2, after all, with Corporate Vice President Joe Belfiore presenting on "Microsoft 365 Application Development." The fact this is focused on Microsoft 365 -- Microsoft's bundle of Windows 10, Office 365 and Enterprise Mobility + Security -- and not Windows specifically reflects Microsoft's most recent reorg, which resulted in Belfiore taking on a new role in the organization responsible for Microsoft 365.

Even though Microsoft's latest reorg cleaved the Windows and Devices Group in two, there are still 97 sessions on the still-incomplete Build session list that mention Windows. Microsoft's recently introduced MSIX application-packaging initiative also is covered by multiple sessions, as is the Windows 10 XAML framework and Progressive Web Apps, which Microsoft will be supporting with its Windows 10 Redstone 4 release.

Microsoft officials have said previously to expect Build to focus heavily on AI, IoT and new experimental areas like quantum computing. There are lots of sessions dedicated to Microsoft's various database platforms, several mentioning bots (including "Enterprise Calling and Meetings Bots for Microsoft Teams"), and some dedicated to mixed/augmented reality.

The Microsoft Graph sessions look interesting, including one focusing on building mobile apps using the Microsoft Graph API (application programming interface) and Xamarin, and another on how to build security apps using the Graph API. The Microsoft Graph API, for those looking for a refresher, is a centralized API meant to help surface more contextual information in order to make apps smarter and stickier.
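
For a sense of how simple a basic Graph request is, here is a minimal Python sketch. It assumes you already hold an OAuth 2.0 access token for the signed-in user (obtaining one via Azure AD is out of scope here); the token value is a placeholder.

```python
#!/usr/bin/env python3
"""Minimal sketch of a Microsoft Graph API call. The /me endpoint returns the
signed-in user's profile; mail, calendar, files and security alerts hang off
the same https://graph.microsoft.com root. The token below is a placeholder."""
import requests

ACCESS_TOKEN = "<oauth-access-token>"  # placeholder, supplied by your auth flow

resp = requests.get(
    "https://graph.microsoft.com/v1.0/me",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
profile = resp.json()
print(profile.get("displayName"), profile.get("userPrincipalName"))
```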

Microsoft is planning to live stream some Build sessions and make many of the others available publicly on demand. However, sessions that are labeled as "Sneak Peeks" are focused on Microsoft getting feedback from attendees and likely won't be made available to watch by those not in attendance.

The "UI platform for AR/VR/MR devices" session on May 8 is a sneak peek, and notes that Microsoft is "considering changes to the UWP platform to make it easy to build immersive experiences for AR/VR/MR devices," including the creation of AR on 2D devices. There's also an "Open source and backward compatibility for the Windows 10 XAML framework" session on May 8 labeled "sneak peek," which indicates that Microsoft is considering releasing parts of the Windows 10 XAML/Fluent UI framework as open source. And the "Bringing cloud powered UI to Windows 10 XAML applications" session on May 9 is also a sneak peek, covering how reactive fits into the Microsoft developer story.

The (in)famous Raymond Chen is one of the presenters in a May 9 session on "Developing for Sets on Windows 10," which could be of potential interest to any/all Windows 10 developers. Also on May 9, the "Fluent Design System Inside of Microsoft: Office" session is going to detail how Windows and Office are working together to bring "Office's productivity expertise to the Fluent design language."

A big theme for Microsoft at Build this year will be convincing developers that they need to understand and incorporate AI technologies in their apps. There are several sessions at Build focused on this topic, including the May 9 breakouts on "10 Things Developers Need to Know About Building Intelligent Apps" and "Conversational AI: Best Practices for Building Bots," the latter of which will offer guidance for using the Bot Builder v4 software development kit, cognitive services and more.


4/30/2018 04:16:00 PM

LG launches portable instant camera-printer hybrid

LG's Pocket Photo Snap is a hybrid of instant camera and portable printer that will allow users to print photos immediately after they take them.



LG Electronics has begun sales of the Pocket Photo Snap, its instant camera and portable printer hybrid, in South Korea.

The PC389 will allow consumers to take photos with its 5-megapixel camera and print out a copy of the photo on the spot.

The reprint button will also allow them to print out multiple copies to share with friends, unlike a conventional instant camera, LG said.

It is charged via USB Type-C and can print 30 photos on a full charge. In Korea, it costs 249,000 won ($233), while the printing paper costs 25,000 won ($23) per 36 sheets.

Pressing the shutter button for 5 seconds switches it to black-and-white mode.

The printer can also be connected to smartphones and tablets through Bluetooth to print out photos from other devices. It supports both Android and iOS.

Users can also download a photo editor app for Pocket Photo Snap to edit photos taken by the portable camera before printing them.

LG is preparing to launch its first flagship phone of the year, the G7 ThinQ, this week.

Last week, LG Electronics posted sales of 15.12 trillion won ($14.1 billion) and operating profit of 1.1 trillion won ($1.03 billion) for Q1 2018.

The company's mobile division, however, saw an operating loss of 136.1 billion won. Mobile brought in sales of 2.16 trillion won, down from 2.99 trillion won in Q1 2017.


4/30/2018 01:30:00 AM

Apple is reportedly making a VR and AR headset with an 8K display per eye

And it could launch in 2020


Apple is developing a wireless headset that can be used for both virtual reality (VR) and augmented reality (AR), according to a new report. 

Impressively, the VR/AR standalone headset will feature displays with resolutions of 8K (7680 x 4320) per eye, according to a person familiar with Apple's plans who divulged the details to CNET. 

For comparison, the HTC Vive Pro, the latest high-end VR headset on the market, has a resolution of 2880 x 1600.

Unlike the HTC Vive and Oculus Rift, Apple's rumored headset wouldn't be tethered to a PC. What's more, it also wouldn't need a smartphone to run, like Google Daydream View and Samsung Gear VR. 

Instead, it would be connected wirelessly to a box housing a powerful Apple-made processor, reports CNET. Apple has been rumored to be developing its own processors for Mac computers with the aim of discarding Intel chips by 2020, so the headset-powering processor could be part of this initiative.

Apple is reportedly targeting a 2020 launch for the headset. It could cancel the headset plans at any time, of course.

Why not both?

After all of Apple's time and energy spent promoting AR, it's interesting to see that the company is (reportedly) combining the technology that overlays digital renders onto the real world (AR) with VR, which takes users into a wholly virtual world. 

CEO Tim Cook has long touted AR's advantages over VR, including saying that AR enhances someone's experience while keeping them present in what's going on around them. 

Just a few months ago, Cook said during an earnings call with investors that he sees "AR as being profound."

"AR has the ability to amplify human performance instead of isolating humans," he said on the same call. "So I am a huge, huge believer in AR. We put a lot of energy on AR. We're moving very fast."

And then there's ARKit, the platform Apple launched with iOS 11 that lets developers create AR apps for the iPhone and iPad. 

But despite its heavy focus thus far on AR, it seems Apple is keen to give users a choice in what they experience: either AR for an enhanced experience that keeps them present in the real world, or VR for those times users want to be transported somewhere else entirely.


Sunday, April 29, 2018

4/29/2018 10:34:00 PM

If your cloud apps don’t have APIs, you’re doing it wrong

Although many developers believe that APIs are optional, the more you understand their value, the more successful your cloud app will be.


If you are building new applications on public cloud platforms, you are faced with a choice: Should you build a set of APIs bound to the new cloud application’s application services? Or should you look for another job?

I’m seeing many new cloud applications that are built to fail. Not because they are poorly designed, but because they are leaving out the power of using application services with well-defined and designed APIs that let other applications access those services.

This use of APIs is no longer optional. The cloud platforms themselves are API-driven. They provide storage services, provisioning services, database services, etc., and they like to work with other things that use APIs.

Not having APIs means that you're providing only a single access path to the cloud-based application: the user interface. Moreover, and most important, you're not providing a path for reuse of the application's services.

Say you write a killer application that does secure banking transactions. You will certainly provide a user interface, but to make the application services reusable, you need to provide APIs or services (that is, microservices) as well. An application that provides 330 functions or behaviors should also provide at least 330 APIs—and typically, many more.

Why? Although users will consume the application services through the user interface, there are opportunities for other applications to reuse much of this application, so you are not reinventing the wheel for common services such as transaction validation or blockchain updates. You certainly can find other uses for those services, and once they are discoverable by others in the dev shop, they will indeed find other purposes.
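
To make that concrete, here is a minimal sketch (mine, not from any particular product) of what exposing one reusable service, transaction validation, as an HTTP API might look like, using Python and Flask. The route and the validation rules are placeholders.

```python
#!/usr/bin/env python3
"""Minimal sketch: one reusable application service - transaction validation -
exposed as an HTTP API so other applications can call it instead of going
through the UI. The business rules here are placeholders."""
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/api/v1/transactions/validate", methods=["POST"])
def validate_transaction():
    tx = request.get_json(force=True) or {}
    errors = []
    # Placeholder rules; a real service would check accounts, balances,
    # fraud signals, and so on.
    if tx.get("amount", 0) <= 0:
        errors.append("amount must be positive")
    if not tx.get("currency"):
        errors.append("currency is required")
    return jsonify({"valid": not errors, "errors": errors})

if __name__ == "__main__":
    app.run(port=8080)
```

Any other application in the shop can now reuse the same validation logic with a plain HTTP POST, which is exactly the kind of discoverable, reusable service the argument above calls for.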

The benefit of this is financial: It's cheaper to build new applications from existing services than to do it all from scratch. The benefit is also agility: if you can build or change applications that are API- or service-based, that will translate into additional business agility. Agility is really the best way to define the value of the cloud.


4/29/2018 09:26:00 PM

Apple makes yet another short-sighted decision

Apple has discontinued a product that it should have made a cornerstone of its home automation and entertainment ecosystem.


After months of rumors, Apple has finally killed off its line of AirPort Wi-Fi routers.

Big mistake.

The AirPort line of routers is a product from a different era. While nowadays everyone has Wi-Fi pretty much everywhere they go, AirPort was built for a time when this wasn't the case, and setting up a secure, easy-to-use Wi-Fi network was far from simple.

So AirPort is no longer needed?

Wrong.

While I think that the days of needing a standalone router are maybe gone, AirPort was more than just a router. One of the models that Apple sold was the Time Capsule, a version that housed a hard drive and made backing up Macs easy.

And AirPort could have fitted in well with other projects that Apple is working on. For example, combining the AirPort Time Capsule with the Apple TV could have resulted in a very interesting device indeed.

Throw in the functionality of the HomePod and the device becomes the ultimate home hub, blowing away the Amazon Echo and the Google Home.

Apple's abandoning of the AirPort line is yet another sign that the company is more interested in chasing the mass market - riding the iPhone wave if you want to think of it that way - than it is about building a broad and functioning ecosystem.

While Wi-Fi routers might not be the cool, cutting-edge gadgets they were once seen as, people still need them, and many are confused and bewildered by even the simplest of networking tasks. Yet combine the features of a Time Capsule AirPort router with those of the Apple TV, add in a sprinkling of HomePod, and Apple would have had the Swiss Army Knife of home hubs, bringing together networking, backup, and home entertainment into a single device.

This seems like a missed opportunity to me, and Apple has vacated a space that I now think that a company such as Amazon (a better fit for a project like this than Google for a number of reasons) should enter.

On the plus side for Apple, perhaps getting rid of smaller side projects such as AirPort might free up company resources to do things like fix iOS and maybe even make the iPhone great again.



4/29/2018 07:23:00 PM

Microsoft's latest idea? How foldable two-screen mobile could use 'hinge gestures'

If Microsoft ever launches a foldable phone, it could have these new hinge controls.


Microsoft has published a patent detailing how it will overcome the challenges of interacting with a foldable device.

It's the latest in a run of patents from Microsoft focusing on foldable mobile devices and follows one in March describing a sophisticated hinge-mechanism to support a dual-screen folding mobile.

The patents may relate to Microsoft's rumored Andromeda folding tablet.

The latest patent, first reported on Windows Central, was filed in 2016 by Microsoft and describes "input based on interaction with a physical hinge", outlining the complexities that a dual-screen hinged device introduces to touch input compared with a single-screen device.

"Consequently, a typical gesture language may be inefficient for these devices given the ergonomics of holding and interacting with such a device. This can detract from user enjoyment and lead to user frustration when using these types of devices," Microsoft's engineers write.

Microsoft says the hinge design necessitates a new class of "hinge-based interactions" that involve the user moving the hinge to provide the computer with an input.

The hinge gesture could be used to start system-level commands, operations, interactions with content, and initiate transitions between views. Microsoft suggests a hinge angle change could be used to switch between a single-tasking state and multitasking state.

The company also considers the possibility of combining other input signals with hinge gestures, for example, for multitasking actions, to launch a related app, or to create a different view in the same app.

These signals could include the speed a hinge is moved at, how the user is gripping the device, and the screen's orientation.

If hinge gestures ever become a thing, Microsoft's engineers think it would be good to provide user feedback to show how far the user has to go to complete a gesture, such as a progress bar indicating what percent of a given action has been completed.

They also think audio and haptic feedback would be useful to tell the user if a gesture is being performed correctly or incorrectly.

Other recent foldable phone patents from Redmond include its self-regulating hinge concept and one from 2014 showing off a tablet-like foldable device.


Microsoft's patent shows how hinge gestures could be used to start commands, operations, and interactions.
4/29/2018 04:48:00 PM

iTunes arrives on the Microsoft Store – at last

Apple breaks into Windows


iTunes has finally landed in the Microsoft Store. The launch comes three months after eagle-eyed Apple fans noticed small tweaks to the desktop software suggesting a Windows app was in the works, and almost a year after it was announced at Microsoft's Build 2017 conference.

This makes iTunes available to users of Windows 10 S (or S Mode, as it will soon become), who are unable to install software from outside the Windows Store.

We’d expected the app to arrive alongside the Windows 10 Spring Creators Update (or Redstone 4, or April Creators Update), but with that launch still in purgatory, it seems Microsoft has relented and let Apple go ahead.

Apple juice

iTunes for Windows is a straight port of the existing desktop software, but packaged in a way that enables distribution via the Microsoft Store. The old desktop software won't update to the new app automatically, though; when you install the app, the installer will offer to remove the legacy version for you. Your music library won't be affected.

Don’t expect a Google Play Music app to arrive in the Microsoft Store any time soon – Google had a cheeky try at smuggling a Chrome installer into the Microsoft Store, which Microsoft swiftly removed, citing security issues. We doubt it'll be opening the doors to other Alphabet apps any time soon.



4/29/2018 01:45:00 PM

The controversial Ubuntu 18.04 LTS is now available to download

Latest long-term support ditches the 32-bit installer


Canonical has released the latest Long Term Support (LTS) version of its popular Ubuntu Linux distribution, and it continues Canonical's habit of giving releases an alliterative animal-based name, with the distro also known as Bionic Beaver.

Ubuntu 18.04 LTS comes with plenty of new features while building upon the changes brought in with last year's Ubuntu 17.10, which means that Canonical's own Unity desktop environment is no longer used, in favour of GNOME.

This switch, which debuted in Ubuntu 17.10, has been a long time coming. Mark Shuttleworth, founder of Canonical, the company behind Ubuntu, last year confirmed on his personal Google+ account that Canonical "will invest in Ubuntu GNOME with the intent of delivering a fantastic all-GNOME desktop", which put paid to Canonical's aim of getting Ubuntu on phones and mobile devices as well.

Another big change, which will surely stoke controversy, is that Canonical is no longer providing 32-bit installer images of Ubuntu. Again, this change was first seen with Ubuntu 17.10, and now it has arrived in the LTS version. For people relying on long term support for their older 32-bit machines, this could cause problems.

However, it follows similar moves by other distros that have dropped 32-bit support, as the industry moves away from older hardware to embrace 64-bit. For example, Tails Linux and Arch Linux have ditched 32-bit support in the past year.


Data collection

The most controversial change with Ubuntu 18.04 LTS, however, is the fact that Canonical will now be collecting system usage data, something that’s sure to rile up privacy-conscious Linux users who have dismissed operating systems such as Windows 10 for similar reasons.

This means Canonical receives data on the version of Ubuntu installed, the manufacturer of the machine you’re running it on, CPU model, desktop environment, packages installed and more.

This data collection is turned on by default, but to Canonical’s credit it appears to be pretty easy to switch off, and the company insists that the data it collects will not be enough to identify users or machines.

The data will also be made publicly available, which could give us a better idea of how popular certain flavors of Ubuntu and its software are, and how it is used. The data could also be used to help improve future versions of Ubuntu, but it remains to be seen if this is enough to appease Ubuntu users who value their privacy.

Bionic beavering away

But, let’s get back to the positive changes brought with Ubuntu 18.04 LTS. It comes with the latest Linux kernel (4.15), hardware acceleration for Nvidia graphics cards for the cloud version of 18.04, faster boot times and a new Hyper-V optimised image for improved performance when running Ubuntu 18.04 LTS on a virtual machine inside Windows 10.

This last feature was worked on in conjunction with Microsoft. Craig Wilhite, Program Manager, Microsoft, has said that “In our upcoming OS release this spring, Hyper-V’s Quick Create VM Gallery will now include an image for the latest Ubuntu 18.04 LTS, officially stamped straight from Canonical”.

We assume that by "upcoming OS release this spring" Wilhite means the Windows 10 April Update, which has been delayed but should be coming out soon.

For more information about what’s coming in Ubuntu 18.04 LTS – both good and bad – you can check out the official release announcement, and make sure you read our in-depth interview with Canonical about this divisive new release.


Saturday, April 28, 2018

4/28/2018 10:37:00 PM

Windows 10 April Update isn’t here yet, but at least Microsoft has honed its rollout process

The pace of Fall Creators Update rollout was a marked improvement.


The speed at which big updates for Windows 10 roll out – with phased deployment over a number of months – has been notoriously sluggish in the past, but some new figures show that Microsoft has quickened things considerably.

The latest statistics from AdDuplex show that on the verge of the next big update (supposedly called the April Update), Microsoft has now delivered the previous upgrade – Fall Creators Update – to 92.1% of Windows 10 systems.

That’s the vast majority of Windows 10 PCs, and it’s particularly interesting when you compare it to the speed that the Creators Update (which arrived last spring) rolled out – something like a quarter of Windows 10 users didn’t have this by the time the Fall Creators Update emerged, as Betanews reports.

So, judging from the pace of the Fall Creators Update rollout, Microsoft has managed to learn from any errors which slowed the previous upgrade down (and made it something of a painful waiting process for some folks).

Bug bashing

Of course, the current imminent Windows 10 April Update has rather stalled before it has even begun, thanks to the emergence of an apparently nasty bug, and subsequent efforts to fix this.

Although, perhaps Microsoft has learned from previous updates in this respect too. It’s better to take the time to fully hone an update, rather than rushing to release it and running the risk of having another Anniversary Update debacle (that particular effort was famously hit by numerous bugs).

It's now rumored that the April Update will be released in May – which, of course, would be a very good reason not to change the name from Spring Creators Update (the previously alleged moniker).



4/28/2018 09:34:00 PM

Google Cloud Platform adds more managed database services

Google Cloud Platform is covering its managed database services bases as it aims to cover more large enterprise use cases.


Google Cloud Platform is rounding out its stable of managed database services as it onboards more large enterprises.

Managed database services are increasingly popular as enterprises aim to abstract the underlying infrastructure and connect with databases via application programming interfaces.

Dominic Preuss, director of product management at Google Cloud, said that the latest additions to the database roster cover the four largest asks from enterprise customers.


"Every enterprise has many database technologies as well as programming languages. These companies are replatforming on more managed services," said Preuss. "We are laser focused on enterprise use cases."


Managed database services are offered by rivals Amazon Web Services, which has an extensive lineup, as well as Microsoft Azure, IBM Cloud and a bevy of others.


The managed database additions include:

  1. Commit timestamps for Cloud Spanner across multiple regions. Commit timestamps let enterprises determine the ordering of mutations and build change logs.
  2. Cloud Bigtable replication beta is rolling out and will be available to all customers by May 1. A replicated Cloud Bigtable database provides more availability by enabling use across zones in a region.
  3. Cloud Memorystore for Redis beta. On May 9, Google Cloud will offer a Redis managed service. Preuss noted that Redis has become a popular enterprise option for moving apps to in-memory architectures.
  4. Cloud SQL for PostgreSQL, which is now generally available. Preuss said that Google has added availability, replication and performance improvements to its PostgreSQL service, which offers 99.95 percent availability (see the connection sketch after this list).
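
Because Cloud SQL for PostgreSQL is standard managed PostgreSQL, any ordinary driver can talk to it. Here is a minimal sketch using psycopg2, assuming the instance is reachable via the Cloud SQL Auth proxy or its IP address; all connection details are placeholders.

```python
#!/usr/bin/env python3
"""Minimal sketch: connect to a Cloud SQL for PostgreSQL instance with a
standard PostgreSQL driver. Host, database, user and password are
placeholders; in practice many deployments route through the Cloud SQL
Auth proxy listening on localhost."""
import psycopg2

conn = psycopg2.connect(
    host="127.0.0.1",        # e.g. the Cloud SQL proxy, or the instance IP
    dbname="appdb",          # placeholder database name
    user="appuser",          # placeholder user
    password="<password>",   # placeholder
)
with conn, conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone()[0])
conn.close()
```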


Preuss said that Google Cloud Platform chose those aforementioned database services due to requests by large enterprises, its own services unit and systems integrators. He added that Google Cloud will continue to add database managed services.

"There are other areas we're investigating," said Preuss. "Whatever enterprises are asking for we will go build. This extension gets us to the majority of use cases."


4/28/2018 07:26:00 PM

Gaming has shed its geeky image to become a ‘cool’ hobby

Folks aren’t embarrassed to be called a ‘gamer’ these days


A new survey on gaming carried out by Dell (the firm which makes Alienware machines) has found that the pastime is no longer the domain of geeks, and is now considered to be a ‘cool’ pursuit – and one that can develop useful real-life skills.

Gone are the days when the gamer was typically considered to be a loner teenager locked in his or her bedroom, and the very label ‘gamer’ was tied closely to the concept of geekery.

The global survey (which encompassed nearly 6,000 video game players across 11 countries) showed that gamers are now your co-workers, siblings or friends, and that fewer than 10% of respondents felt embarrassed, judged or as if guilty of some childish crime when called a 'gamer'.

Indeed, 'gamer' is now considered a positive term, with 35% of respondents equating it with 'fun', 29% considering it a 'cool' hobby, and 26% labeling it 'exciting'.

The popularity of esports and social media is helping to spread the word about how enjoyable gaming can be, and gamers are now less shy about sharing their passion with others, with a quarter of respondents having introduced five or more people to the hobby.

Developing diversity

Diversity is also becoming a stronger suit for gamers, with the research pointing to a sharp increase in the numbers of female players in recent times – 47% of respondents had a female friend who games.

And, 86% of those surveyed said the gender of those who they were matched up against in online play was of no consequence (you may well expect this to be higher, of course, but let’s face it, there’s always a toxic element when it comes to online gaming).

When it came to online match-ups, the principal concern was the opponent’s skill level, with 40% of respondents saying this was important. The ethnicity, political views and sexual orientation of opponents were inconsequential to most folks, as you would hope, only being a concern to 8%, 7%, and 6% respectively (the toxic minority strikes again).

Another interesting point is that gaming isn’t just regarded as fun, but also a way of honing skills, with 37% believing it improved their hand-eye coordination, and 36% thinking it made their reaction times better.

Gaming isn’t just about reactions and twitch skills, though, and can also sharpen the mind. In fact, the largest percentage – 39% – said it made them more strategic thinkers, and 27% said it enhanced their teamwork skills.


4/28/2018 04:23:00 PM

Best Xiaomi Mi 6X features: Is this what the Mi A2 will look like?

Here's what we liked


Xiaomi announced the long-awaited Mi 6X smartphone in China on Wednesday. It is the successor to the Mi 5X, known in India as the Mi A1.

With the Mi 6X, Xiaomi has differentiated the phone from the Redmi series without increasing its price. It has a different design and form factor, with a smaller battery, a different camera and the latest version of Android.

In China, the Mi 6X comes in three RAM/storage variants: 4GB/32GB, 4GB/64GB and 6GB/128GB. Prices start at CNY 1,599 (approx. Rs. 16,900) for the base variant, while the 4GB RAM and 64GB storage option costs CNY 1,799 (approx. Rs. 19,000). The high-end variant has a CNY 1,999 (around Rs. 21,000) price tag. It went on sale in China at 10am local time on Friday, April 27.

The Mi 6X is made for China, but the phone may make it to India with some changes as the successor to the Mi A1. If it does, we can expect Android One certification, as the Mi A1 was very well received by Indian users.

Until then, let us take a look at the best features of the Mi 6X.

AI-powered dual-camera 

One of the major upgrades this time is the camera, which now has AI integration to enhance photo reproduction quality. On paper, the front camera looks solid for low-light pictures: it has a 20MP Sony IMX376 sensor with an f/1.75 aperture and a fixed focal length. This means the Mi 6X/A2 focuses on selfies too.

The back of the phone has a 12MP primary camera with a Sony IMX486 sensor of f/1.75 aperture and 1.25-micron pixel size. The secondary camera is exactly the same as the front camera sensor but with 1-micron pixel size.

While the setup seems promising on paper, it is further backed by AI scene recognition for enhancing colors and reproducing natural-looking portrait pictures. To recall, we rated the Mi A1 the best sub-15K smartphone camera, and we expect the same from this one.

Up-to-date chipset

The Mi 6X runs Qualcomm's Snapdragon 660 octa-core chipset (4x2.2GHz Kryo 260 cores + 4x1.8GHz Kryo 260 cores) at its heart. It's the same chipset that powers the recently launched Nokia 7 plus and has done really well in our tests. On the basis of our experience with the Nokia 7 plus, the 6GB RAM variant of the phone is likely to offer smooth performance.

Sleek design

The Mi 6X retains the design language of its predecessor, but there have been a few additions on top. Like the Redmi Note 5 Pro and new Mi Mix 2S, the Mi 6X also has a vertical iPhone X-like dual-camera design.

At 7.3mm, the Mi 6X looks sleek and resembles the Mi A1 from the back. But the front has seen major changes: the 5.5-inch 16:9 display has been upgraded to a new 18:9 FHD+ display measuring 5.99 inches, resulting in minimal bezels and more screen space.

Connectivity and sensors

It is priced affordably, but the Mi 6X doesn’t compromise on connectivity options and sensors. It has all the basic connectivity options like 4G LTE, dual-band Wi-Fi a/b/g/n/ac, Wi-Fi Direct, Miracast, Bluetooth 5.0, IR emitter and a USB Type-C port. 

To recall, the Mi A1 was the only phone under this price range to have a USB Type-C port, while phones like Redmi Note 5 Pro, Zenfone Max Pro M1, and Honor 9 Lite still use a micro USB port. 

The phone, aside from the complete set of connectivity options, packs all the basic sensors too. It has an accelerometer, ambient light sensor, gyroscope, proximity sensor and fingerprint sensor.

Latest software

Unlike the Redmi Note phones from Xiaomi, the Mi 6X ships with Android 8.1 Oreo out of the box. The Chinese variant will have an MIUI 9.5 skin atop. As already mentioned, if it debuts in India, it’s most likely going to run stock Android.

What’s more?

The smartphone has a 3010mAh battery, with Quick Charge 3.0 support. Notably, the 3.5mm jack is absent, but we assume the company will provide a converter in the box. Still, it might be a deal breaker for some.

It has face unlock, and also AI-powered translation that can convert text from Chinese to English, French, German, Spanish, Japanese, Korean and Indian languages.


4/28/2018 01:25:00 PM

Finding the data buried in cloud storage

With cloud object stores becoming the de facto data lakes, a recent survey shows that enterprises are between a rock and a hard place when it comes to finding and accounting for all the data that is piling up.


It's human nature for messes to spread across all empty spaces. We pointed out a trend several months back that for a growing cross-section of enterprises, cloud object storage is becoming the de facto data lake. The good news is that cloud object storage is relatively cheap and highly scalable, and increasingly, accessible. For instance, most cloud Hadoop services swap in object storage for HDFS, and increasingly, cloud providers are delivering services that provide ad-hoc query or treat cloud object stores as extended tables for data warehouses.

The flip side of relying on cloud storage as the default target or data lake is the need to reconcile the accumulation of data in a general-purpose target with the need to become more accountable for data privacy or data protection, especially with regulations such as GDPR taking effect.

Chaos Sumo, a company that plans to introduce a search layer for SaaS providers to add atop cloud storage (for now, Amazon S3) in the summer, has just released a survey showing some of the pain points that cloud adopters are feeling.

Admittedly, at 120 respondents, the survey size was modest. And targeted at data ops professionals, the sample was likely skewed towards organizations already embracing the cloud. For instance, 72% indicated that they use some form of cloud object storage today. For those using Amazon S3, 40% of respondents stated they expected that their use of S3 storage would grow at least 50% in the next year.

For enterprises, the primary use was for backup, storage, and archiving. But 28% are already using object storage for data lakes, while another 18% plan to implement one over the next 12-18 months. Not surprisingly, for this AWS-dominated sample, a similar proportion (23%) reported using Amazon Athena today. Roughly half use the Amazon Redshift data warehouse, which, with Spectrum, can now treat S3 as an extended table.

The innovation of tools such as Athena is opening up interactive access to data in a system otherwise optimized for storage, without the need for ETL (although the data must be in some form of semi-structured format, such as CSV, JSON, Parquet or others).
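
As a rough illustration of that ad-hoc access, here is a minimal sketch that runs an Athena query over S3-resident data with boto3; the database, table, region and results bucket are placeholders, not anything from the survey.

```python
#!/usr/bin/env python3
"""Minimal sketch: an ad-hoc Athena query over data sitting in S3. Athena
expects the underlying objects to be in a supported format such as CSV,
JSON or Parquet. All names below are placeholders."""
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

qid = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) FROM web_logs GROUP BY status",
    QueryExecutionContext={"Database": "clickstream"},                   # placeholder
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},   # placeholder
)["QueryExecutionId"]

# Poll until the query finishes, then print the first page of results.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```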


But as the chart shows, as the data piles up in object storage, a growing minority is concerned about accountability. That has been the advantage of commercial distributions of platforms such as Hadoop and packaged tooling for analytics and data preparation, which feature some form of data lineage, security, and access control as their raison d'être. By comparison, cloud object stores are naked when it comes to governance or perimeter security -- that has traditionally been the job of the data platform, cloud host, or analytic tool that consumes the data.

So a quarter of the sample is concerned that they will have to move data to analyze it, while smaller, but statistically significant, minorities voice concern about finding the data, compliance, and security. They are spending significant time cleaning and preparing data -- well over half report spending at least six hours per week, with nearly 40% of respondents saying they devote over 11 hours per week to the task (those are results that the data prep companies would eat up).

Significantly, only 7% of the sample reported that it is currently easy to analyze data squirreled away in object storage today. That's where the commercial for the survey sponsor, Chaos Sumo, comes in. The company plans to introduce what it terms a "data fabric" that will open S3 data to Elasticsearch by summer for OEM use by existing SaaS providers. We expect S3 to become a sweet spot for more analytic platforms and tools. For Chaos Sumo, adding search as a utility for SaaS providers to make this data more visible will be yet another step toward taming the cloud storage beast.


Friday, April 27, 2018

4/27/2018 10:15:00 PM

Google is giving its G Suite a design overhaul with security, collaboration and artificial intelligence enhancements.

These additions are designed to make G Suite more collaborative and address enterprise security needs. The overhaul also better integrates various G Suite apps.



How do you get the G Suite updates? Google said the new Gmail experience will roll out today to companies in the G Suite Early Adopter Program via the admin console. Personal Gmail users can opt in by going to Settings and selecting "Try the new Gmail."

This overhaul is notable since G Suite is increasingly competing in a crowded collaboration, smart office, and productivity space. While Microsoft Office, which is integrating Teams functions, is the obvious rival, upstarts such as Slack and Atlassian are also playing in the space. Google, however, has a strong foothold via Gmail as other rivals are trying to nuke email as a communications medium.

Add it up and G Suite is part of a big field trying to make productivity a bit easier.

Here's the rundown beyond the new look:

Gmail is taking aim at what Google calls Business Email Compromise threats. Last month Google added phishing defenses, and now it is adding a confidential mode that allows emails to expire or be revoked. Senders can also require two-factor authentication before a confidential message can be viewed.


  1. Google also redesigned its security warnings inside Gmail to make them more explicit, and added controls that remove the option to forward, copy, download or print messages.
  2. Gmail gets more artificial intelligence via features like Nudging, Smart Reply and high-priority notifications. High-priority notifications curb interruptions, and Gmail will also recommend what to unsubscribe from.
  3. Tasks gets an overhaul and integration with calendar. Tasks will be available in a side panel in G Suite apps.


Google has conducted some research to justify the G Suite ROI. Some fun facts include:

  1. Snooze can save users up to 100 million opens a month.
  2. Each week, Nudging keeps 8 percent of business users from dropping the ball on an email.
  3. High priority notifications cut notifications by 50 percent on average.
  4. Business users will spend about 60,000 hours less in email each day due to hover actions.


4/27/2018 09:19:00 PM

Intel reports strong Q1, accelerates growth in data center business

CEO Brian Krzanich noted that Intel's data-centric businesses accounted for almost half of Q1 revenue.


Intel delivered strong first quarter financial results Thursday, led by growth in its data center and Internet of Things businesses.

The tech giant reported a net income of $4.5 billion, or 93 cents per share.

Non-GAAP earnings were 87 cents per share on revenue of $16.1 billion, up 13 percent from the year prior.

Wall Street was looking for earnings of 72 cents per share with $15.1 billion in revenue.

In prepared remarks, CEO Brian Krzanich noted that Intel's data-centric businesses accounted for almost half of Q1 revenue.

On a conference call with analysts, Krzanich said:

What we're seeing is an unrelenting demand for computing performance driven by the continuing growth of data and the need to process, analyze, store and share that data. That dynamic benefits our traditional CPU business and it reinforces the big bets we've made in memory, modem, FPGAs and autonomous vehicles. We're competing to win in our largest collection of the addressable market ever.
Looking closer at the numbers, Intel's data center unit rang up $5.2 billion in revenue during the quarter, up 24 percent year over year. The Internet of Things business grew at a rate of 17 percent year over year to deliver $840 million in revenue.

Meanwhile, Intel's client computing group posted revenue of $8.2 billion, up three percent from the previous year.

"Compared to the first-quarter expectations we set in January, revenue was higher, operating margins were stronger and EPS was better," said Intel CFO Bob Swan. "Our data-centric strategy is accelerating Intel's transformation."



Swan said Intel is raising its full-year outlook based on the strong quarter. The company now expects revenue of approximately $67.5 billion and earnings of $3.85 per share, up from its January guidance of $65 billion and $3.55 per share.

For the current quarter, Wall Street is looking for non-GAAP earnings of 81 cents per share with $15.55 billion in revenue. Intel guided well above that target, with second-quarter EPS guidance of 85 cents and revenue guidance of $16.3 billion.

Swan was upbeat about the outlook. He said:

We believe 2018 will be another record year for Intel. We've met and exceeded our financial commitments, and we feel great about where we are relative to our 3-year plan. Our PC-centric team keeps winning in a challenging market, and our data-centric businesses are growing fast, fueling Intel's transformation to a company that powers the cloud and smart connected devices.


4/27/2018 07:13:00 PM

Why SATA flash drives are being left in the dust

The SATA interface on SSDs was a marriage of convenience, not love. Flash SSDs are so fast that SATA simply can't keep up. But NVMe can, and that is changing how systems are configured and infrastructures are designed.



I noted years ago that if flash had been available in 1956 - the year the first disk drive shipped - we would never be building SSDs with SATA interfaces today. SATA works well for disks because disks are slow.

SATA responded by upping its data rates from a raw 3 Gb/sec to 6 Gb/sec. But bandwidth wasn't the problem - IOPS and latency were.

While drive vendors will continue to produce SATA SSDs for cost-sensitive users, the real action today is in NVMe SSDs. These are extremely high-performance SSDs that provide hundreds of thousands of IOPS, gigabytes per second of bandwidth, and extremely low latency - performance that only a few years ago required a $100,000-plus storage array.
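
A quick back-of-the-envelope comparison (my arithmetic, not Kingston's) shows how wide even the raw link gap is, assuming SATA III's 6 Gb/s link with 8b/10b encoding and an NVMe SSD on a PCIe 3.0 x4 link at 8 GT/s per lane with 128b/130b encoding:

```python
# Back-of-the-envelope link bandwidth comparison. Assumptions: SATA III at
# 6 Gb/s with 8b/10b encoding; PCIe 3.0 x4 at 8 GT/s per lane with 128b/130b
# encoding. Real-world throughput is lower, and IOPS/latency matter even more.

sata_payload_gbps = 6 * (8 / 10)              # ~4.8 Gb/s of usable payload
pcie3_x4_payload_gbps = 8 * (128 / 130) * 4   # ~31.5 Gb/s of usable payload

print(f"SATA III   : ~{sata_payload_gbps / 8:.2f} GB/s")       # ~0.60 GB/s
print(f"PCIe 3.0 x4: ~{pcie3_x4_payload_gbps / 8:.2f} GB/s")   # ~3.94 GB/s
```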

CASE STUDIES

Most customers are loath to share the secrets of their configuration choices, rightfully considering them trade secrets in an intensely competitive environment. But I recently spoke to a Kingston Technology senior systems engineer, Cameron Crandall, about customer deployments that showed how NVMe SSDs are changing how people configure systems and manage their workflows.

Kingston is the storage industry's best-kept secret. They are a privately held, $6.6 billion US company. Started by a couple of engineers in the late 80s, they focus on well-engineered products rather than marketing. Which may explain why they have a 59 percent add-on server memory share - not counting their OEM business. Color me impressed.

Kingston also makes NVMe SSDs. That's where my discussion with Cameron came in.

LARGE WEB HOST

Cameron described a large web hosting company that, for performance reasons, used an internal PCIe RAID controller on web caching servers. Behind the RAID controller, they had configured four SATA SSDs to achieve their capacity and bandwidth goals.

While it worked, that configuration was a setup and management hassle. Five components meant more frequent system slowdowns - when drives failed - and maintenance headaches.

Replacing that with a single NVMe SSD, they were able to remove the RAID controller and four SATA SSDs from every configuration. That change alone meant much less need to open servers.

Not only did the customer get a much simpler configuration, but they also got higher performance and lower latency. Since they replaced five devices with one - a Kingston DCP1000 with an 800,000-hour MTBF and up to 3.2TB of capacity - their subsystem reliability is considerably better, improving uptime.

HOLLYWOOD STUDIO

While 4K is all the rage in home video, Hollywood is starting to transition to shooting in 8K to help future-proof its content. But 8K video files are 4x the size of 4K video files, which creates a serious problem for the digital techs charged with preserving, moving, and sharing them.

One large studio Cameron works with faced the issue of moving 2-3 terabytes of 8K data between the production site and the post-production facility, twice a day! No network could handle this fast enough, so the studio opted to copy the data to transportable NVMe/PCIe drives.

The studio estimates that using the NVMe drives saves the camera operator about an hour a day, waiting for the data to copy. But the big win is that at least 10 people are waiting to access the footage, and getting it an hour sooner translates to real savings for the production, as well as the faster turnaround.

So these NVMe drives have replaced portable storage arrays, and networks - even 10Gb Ethernet - for moving data rapidly. Note, the studio used prototype drives from Kingston for testing, but we can expect to see them generally available later this year.

THE STORAGE BITS TAKE

System configuration is a constant game of one-upmanship between CPU, interconnects, and storage. Faster CPUs - which have been thin on the ground lately - chew up data faster, requiring more bandwidth and IOPS.

Faster interconnects, such as 12 Gb/sec SAS, provide more bandwidth, but once the storage can provide lots of IOPS, bandwidth becomes secondary to latency for most workloads. It is an ever-evolving merry-go-round - and good news for everyone who relies on computers.

NVMe SSDs are already making a mark in prosumer notebooks, where Apple has been a leader. But SATA's low cost will keep it in the game for price-sensitive users.

With the continued growth of edge computing, the real win is in the cloud, whose massive warehouse-sized computers make our smartphones smart, and whose smooth operation and high performance pay dividends for billions of users.