Breaking

Wednesday, February 28, 2018

2/28/2018 10:55:00 PM

Amazon boosts its home security portfolio with acquisition of Ring

It's the second such acquisition in a few short months for the online retailer.



Ring, the company best known for its video doorbell, has been acquired by Amazon, ZDNet can confirm.

The acquisition was first reported by Geekwire. A formal announcement isn't expected at this time.

According to Reuters, Amazon paid over $1 billion for the company.

"Ring is committed to our mission to reduce crime in neighborhoods by providing effective yet affordable home security tools to our Neighbors that make a positive impact on our homes, our communities, and the world," A Ring spokesperson said in a statement to ZDNet.

The spokesperson added: "We'll be able to achieve even more by partnering with an inventive, customer-centric company like Amazon. We look forward to being a part of the Amazon team as we work toward our vision for safer neighborhoods."

For Amazon, the purchase follows its acquisition of another home security company, Blink. It also comes after the company introduced its first home security camera, Amazon Cloud Cam.

"Ring's home security products and services have delighted customers since day one. We're excited to work with this talented team and help them in their mission to keep homes safe and secure," an Amazon spokesperson said in a statement to ZDNet.

Ring's product lineup includes several different cameras, with various power options. In January, the company announced its complete home security system, Ring Alarm, was close to shipping after a lengthy delay. The company also announced it had acquired Mr. Beams at that time.

Amazon's interest in home security makes sense as it looks to expand the reach and potential of its smart speaker and home automation lineup.




2/28/2018 10:14:00 PM

MWC: Panasonic launches Toughpad FZ-M1 Thermal Imaging Solution

Panasonic's 7-inch Toughpad FZ-M1 gets an integrated FLIR Lepton Thermal Camera, equipping it for a range of vertical applications in the field.



Panasonic has added a thermal imaging camera to its 7-inch 'fully rugged' Toughpad FZ-M1 Windows tablet. With a thermal camera onboard, field workers in industries such as automotive or building maintenance will be able to record, process and document temperature measurements on their main work device.

The integrated thermal camera comes courtesy of market leader FLIR, which announced its 'Thermal by FLIR' program and the first four partners -- ARSENZ, Casio, Panasonic, and TinkerForge -- at CES in January.

The FLIR Lepton micro-thermal camera integrated into the Toughpad FZ-M1 delivers 160-by-120 thermal resolution and has an imaging temperature range of -20°C to 400°C, accurate to plus or minus five degrees.
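
To give a feel for the data involved: radiometric Lepton modules commonly report each pixel as a 16-bit value in centikelvin (hundredths of a kelvin) -- an assumption here, since Panasonic's app handles the conversion internally. Under that assumption, turning a raw 160-by-120 frame into Celsius or Fahrenheit readings, and pinpointing the hottest pixel (something the Pro app mode described below also offers), is straightforward:

```python
import numpy as np

# Hypothetical raw frame: 16-bit centikelvin per pixel (assumed TLinear
# format; Panasonic's Standard/Pro apps do this conversion internally).
frame_ck = np.full((120, 160), 29815, dtype=np.uint16)  # 298.15 K (25 C) everywhere
frame_ck[60, 80] = 35315                                # one 80 C hot spot

celsius = frame_ck / 100.0 - 273.15
fahrenheit = celsius * 9 / 5 + 32

print("hottest pixel:", np.unravel_index(celsius.argmax(), celsius.shape))  # (60, 80)
print("max C:", celsius.max())     # ~80.0
print("max F:", fahrenheit.max())  # ~176.0
```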

The thermal imaging application is designed by Panasonic and offers Standard and Pro options. Standard mode supports thermal snapshots, changing the thermal palette, a thermal pointer, and temperature readings in Celsius or Fahrenheit. In Pro mode, you can also pinpoint maximum and minimum temperatures, record thermal video, and add metadata such as QR codes to images.

The fanless Toughpad FZ-M1 tablet runs Windows 10 Pro on 6th-generation Intel Core or Atom processors. Its 3,200mAh battery is rated for eight hours and can be hot swapped. The 1280-by-800 IPS touchscreen is sunlight readable and can be operated with a stylus or when wearing gloves.

Configuration options include a barcode reader, serial port, 4G connectivity, and a SmartCard reader. The fully rugged tablet can withstand drops from heights of up to 1.8m and is IP65 rated for dust and water resistance.

The Panasonic Toughpad FZ-M1 Thermal Imaging solution will be available from the end of February. The thermal camera option will be available later in the year on other Panasonic products.





2/28/2018 08:10:00 PM

Fitbit Ionic: Adidas Edition announced with coaching features for improved running performance

The Fitbit Ionic is a powerful activity tracker and fitness watch. The new special edition adds six on-screen workouts through the Adidas Train app.



Last year Fitbit and Adidas announced a partnership, and today we see the fruits of that agreement with the upcoming Fitbit Ionic: Adidas Edition, which will be available on 19 March 2018 for $329.95.

This new Adidas Edition Fitbit features a unique coaching experience through the Adidas Train app, which includes six on-screen workouts. The Adidas Edition also has an exclusive two-tone breathable sports band in Ink Blue and Ice Gray with a Silver Gray aluminum case, along with a custom Adidas-designed watch face inspired by the iconic race bib and available in four colors. These features and this band will not be available for the existing Fitbit Ionic.

The Adidas Train app provides step-by-step coaching to guide you through each series of movements and ensure you are performing them correctly. These workouts include:
  • Dynamic Warm Up to increase your core temperature and get your body ready to work (5 min.)
  • Power Pace to train your body to be more elastic, forceful, and efficient (10 min.)
  • Metabolic to increase your speed and boost your metabolism (15 min.)
  • Run Activation to improve your hip, core and shoulder stability (5 min.)
  • Strong Strides to build strength throughout your run (10 min.)
  • Post-Run Stretch to ensure proper recovery with a fast and easy cool-down stretch (5 min.)


While the Fitbit Ionic (see our full review) can't match the Apple Watch as a smartwatch, it offers an advanced activity tracking and GPS sports watch experience. Given that fitness is one of the most used aspects of the Apple Watch, the Fitbit Ionic is a good alternative for those looking for something with multi-day battery life and enhanced sleep tracking.

It has improved with updates and will continue to improve as developers continue to provide new apps and watch faces.




2/28/2018 04:36:00 PM

Does Fitbit have time to pull off its digital healthcare transformation?

Fitbit's grand plan to pivot to software, data, and services in the healthcare industry makes total sense. The big question is whether it'll have the time to pull it off using its device business to fund a business model pivot.



Fitbit is embarking on a multi-year business transformation to pivot to software, healthcare services, data, and analytics, but has to keep its wearable and smartwatch business humming long enough to fund it.

The ongoing debate around Fitbit will revolve around whether the company has the time to execute its business model shift.

Fitbit had a rough fourth quarter and its first-quarter outlook was even worse. Fitbit reported a fourth-quarter net loss of $46 million, or 19 cents a share, on revenue of $571 million. Non-GAAP loss per share in the fourth quarter was 2 cents a share.

As for the first quarter, Fitbit said it would see limited revenue from new products and a sales decline of 15 percent to 20 percent to $240 million to $255 million and a non-GAAP loss of 18 cents a share to 21 cents a share. Fitbit sees 2018 revenue of about $1.5 billion and plans to cut expenses another 7 percent.

Anyone watching Fitbit for a while knows the challenges and opportunities. The breakdown goes like this:

CHALLENGES
  • Wearable trackers are losing favor to smartwatches
  • Fitbit has to compete with much larger rivals in smartwatches (Apple, Samsung, Garmin, etc.) and is unproven.
  • Fitbit has some scale, but not enough and device shipments are lumpy. Simply put, Fitbit is no Apple.
  • The company has been cutting expenses and trying to be more efficient in manufacturing.


OPPORTUNITIES

  • Fitbit still has a decent-sized war chest and executives know they need to keep their powder dry.
  • It has been acquiring companies that put it more into the health care system and plans more.
  • Fitbit has a ton of data it can monetize.
  • The company is cutting expenses but leaving research and development intact.

Is this Fitbit turnaround sustainable? Here's a look at Fitbit's device shipment fluctuations dating back to 2013.



And the cash ebb and flow.



On a conference call with analysts, the main chore for Fitbit executives was to convince Wall Street it not only had a strong transformation plan but the time to actually deliver.

CEO James Park said the company sold 15.3 million devices in 2017, boosted its active community of users to 25.4 million and delivered foundational assets for a line of new smartwatches. Park then delivered a caveat.

While I'm optimistic about our progress, there's still a lot of work to do. We expect it to be a multiyear transition process, and we'll leverage our core assets -- our brand, community, and data -- to focus on four key areas: adapting to the changing wearable device market, deepening our reach into healthcare, increasing our agility and optimizing our cost structure, and transforming our business from an episodic-driven model centered around device sales to more recurring non-device revenue.

Park remained optimistic about Fitbit's pivot to the healthcare industry. Indeed, Fitbit is participating in research, has partnerships with key players such as UnitedHealthcare and the National Institutes of Health, and is using acquisitions to better support patients. Park added:

Data is valuable to payers for the actuarial assumptions and can be the basis for financial incentives to consumers. It can also be leveraged on a de-identified basis by research institutions and platforms. Data can also be utilized to pioneer new digital therapeutics, effectively utilizing software as a medical device. Or data can be utilized to provide AI-driven insights and reminders that help people understand the impacts of their actions on their health with the goal of making positive behavior changes on a daily basis.

To fund this pivot, Fitbit needs to get its device act together and hope it can retain customers against the likes of Apple. Park explained how Fitbit used to be focused on units, but that led to higher write-downs and scrap costs.

Our goal is to drive incremental margins on the device side of the business by redeploying capital to Fitbit health solutions and recurring revenue opportunities. Lastly, we see a real opportunity to transition our business from an episodic-driven model centered around device sales to more of a lifetime value approach with recurring non-device sources of revenue.

Fitbit has strong share in wearables but has committed to a family of smartwatches. Why? The company needs more to sell, and the Ionic was more of a performance watch. Fitbit is aiming more for the mass market with its next round of smartwatches. On the fitness tracker side of the equation, Fitbit plans to consolidate its lineup.

Add it up and Fitbit's 2018 will be another game of wait-and-see. Wait to see whether Fitbit can maintain its cash position. Wait to see how the healthcare partnerships and recurring revenue stack up. And see if Fitbit has enough flexibility and patience from Wall Street. Fitbit's transformation plan is solid, but the company may not have multiple years to deliver.

Bottom line: Fitbit could be a nice acquisition target--especially if it gains real traction in the healthcare system.



2/28/2018 04:09:00 AM

Red Hat introduces updated decision management platform

Business process management doesn't have to be a pain with the right platform.



Troubleshoot a network? No problem. Write a 3,000-word article on Kubernetes cloud container management? When do you want it? Talk to a few hundred people about Linux's history? Been there, done that. Manage a business's delivery routing and shift schedule? I'll break out in a cold sweat.

If you too find the nuts and bolts of business process management a nightmare, you'll want to check out Red Hat's latest program: Red Hat Decision Manager 7.

This program is the next generation of JBoss BRMS, a scalable, open-source business rules management system. It includes both business resource planning and complex event processing (CEP) technology.

By helping your organization or business capture its business logic, it enables you to automate business decisions across heterogeneous physical, virtual, mobile, and cloud environments using a modern microservices architecture. Decision Manager 7 is fully compatible with Red Hat's Middleware portfolio and Red Hat OpenShift Container Platform, so you can deploy it in hybrid cloud environments.
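
Decision Manager's rules are authored in Drools' DRL or through its web tooling, so the following is only a schematic Python sketch (all names invented) of the underlying idea: business decisions captured as declarative rules, kept separate from application code and evaluated against incoming facts.

```python
# Toy rule engine: each rule pairs a condition with an action.
RULES = [
    {"name": "vip-discount",
     "when": lambda order: order["tier"] == "gold" and order["total"] > 100,
     "then": lambda order: order.update(discount=0.15)},
    {"name": "flag-for-review",
     "when": lambda order: order["total"] > 10_000,
     "then": lambda order: order.update(needs_review=True)},
]

def decide(order: dict) -> dict:
    """Apply every matching rule to the fact, mutating it in place."""
    for rule in RULES:
        if rule["when"](order):
            rule["then"](order)
    return order

print(decide({"tier": "gold", "total": 250}))
# {'tier': 'gold', 'total': 250, 'discount': 0.15}
```

The point is that the RULES table, not the surrounding code, is what a business analyst would own and change.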

Tools such as this often require a lot of custom coding before they're useful. This is a low-code development tool, which enables business users to work smoothly with the application development team. If you think of it as a DevOps tool for management and developers, you won't be far wrong.

There's a real need for such programs. According to industry analyst firm IDC, non-traditional developers are expected to build 20 percent of business applications and 30 percent of new application features by 2021. If we want to avoid creating useless business process programs -- and boy haven't we all seen some of those! -- Decision Manager could be quite useful.

According to Mike Piech, Red Hat's VP and general manager of Middleware, "The notion of low-code development is less about eliminating code or cutting traditional programmers out of the application development process, and more about helping business and IT users to do what they need to do quickly and efficiently, and in a complementary manner. Ultimately, what low-code tools should offer -- and what we have built with Red Hat Decision Manager -- is not a platform geared toward one or the other, but rather a rich and tightly integrated feature set designed to provide a better user experience regardless of whether you are a business analyst or hardcore developer."

Red Hat built this platform for both traditional and cloud-native applications. It can create rules-based decision and planning microservices that can be deployed on-premises within a customer's datacenter, or as containerized services on Red Hat OpenShift Container Platform.

OpenShift, a Kubernetes and Docker cloud-based technology -- what does that have to do with business processes, you ask. Remember what I said about DevOps? It enables your business to enhance your processes with such DevOps tricks of the trade as automated testing and continuous integration and delivery (CI/CD).

Companies want business process management (BPM). A Red Hat survey found over half of Red Hat customers, 57 percent, want BPM software to automate internal processes. Others, 46 percent, want it to help support new applications, while 41 percent want it to automate external processes, e.g., processes that touch customers, partners, or suppliers. Finally, a substantial minority, 29 percent, want it to support self-service applications.

Want to give Red Hat Decision Manager a try? Red Hat Decision Manager is available for download by members of the Red Hat Developers community. Customers can get the latest updates from the Red Hat Customer Portal. Just don't ask me to work out your business processes before you try to automate them. I have enough trouble organizing my small business workflow.




2/28/2018 01:07:00 AM

AWS makes Serverless Application Repository generally available




It's designed for producers and consumers of serverless apps, supporting publishing, discovery, and deployment.

Amazon Web Services (AWS) on Wednesday announced the general availability of its AWS Serverless Application Repository, which enables users to find and deploy serverless applications and components. It was previously available in public preview.

The Serverless Application Repository can be accessed from the Lambda Console.

For consumers, serverless applications and components can be configured and deployed from the repository to an AWS account as-is. They can also add features and submit pull requests to the author.

For publishers, a name, description, labels, and an open source license are chosen while submitting a contribution. Publishers then supply a link to the existing source code repo and a SAM template, and designate a semantic version.

Amazon said applications can be deployed in the US East (Ohio), US East (N. Virginia), US West (N. California), US West (Oregon), Asia Pacific (Tokyo), Asia Pacific (Seoul), Asia Pacific (Mumbai), Asia Pacific (Singapore), Asia Pacific (Sydney), Canada (Central), EU (Frankfurt), EU (Ireland), EU (London), and South America (São Paulo) regions.

For global availability, you can publish from the US East (N. Virginia) or US East (Ohio) regions.
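
The repository is also scriptable. As a rough sketch of the consumer workflow using boto3 (the application ARN below is hypothetical; browse real ones with list_applications or in the Lambda console), a repository app is staged as a CloudFormation change set and then executed:

```python
import boto3

repo = boto3.client("serverlessrepo", region_name="us-east-1")
cfn = boto3.client("cloudformation", region_name="us-east-1")

# Hypothetical application ARN -- substitute one found via
# repo.list_applications() or the Lambda console.
app_id = "arn:aws:serverlessrepo:us-east-1:123456789012:applications/hello-world"

# Stage the application as a CloudFormation change set...
change_set = repo.create_cloud_formation_change_set(
    ApplicationId=app_id,
    StackName="my-hello-world",
)

# ...then execute the change set to deploy it into this account.
cfn.execute_change_set(ChangeSetName=change_set["ChangeSetId"])
```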




Tuesday, February 27, 2018

2/27/2018 10:43:00 PM

Western Digital unveils world's fastest UHS-I flash memory card


Western Digital has boosted the performance of its SanDisk Extreme UHS-I microSD card lineup with a new 400GB SanDisk Extreme UHS-I microSDXC card, the world's fastest UHS-I flash memory card.

The card, unveiled at MWC 2018, offers read speeds of up to 160MB/s and write speeds of up to 90MB/s, and conforms to the Application Performance Class 2 (A2) specification, which demands a minimum random read speed of 4,000 IOPS, a minimum random write speed of 2,000 IOPS, and a minimum sustained sequential write speed of 10MB/s in order to facilitate fast application launching.
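
For context, the A2 random-I/O minimums are measured in 4KB operations, so they translate to modest-looking but latency-critical throughput figures:

```python
# Back-of-envelope: A2 minimums expressed as throughput,
# assuming the spec's 4KB random I/O size.
IO_SIZE = 4 * 1024  # bytes

for label, iops in [("random read", 4000), ("random write", 2000)]:
    print(f"{label}: {iops * IO_SIZE / 1e6:.1f} MB/s")
# random read: 16.4 MB/s
# random write: 8.2 MB/s
```

It's these small random operations, not the headline sequential speeds, that determine how quickly apps launch from the card.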

As is the case with other SanDisk Extreme cards, this new addition is shockproof, waterproof, X-ray-proof, and good for a wide range of temperatures (operating temperatures between -13ºF to 185ºF/-25ºC to 85ºC, and storage temperatures between -40ºF to 185ºF/-40ºC to 85ºC).

The 400GB SanDisk Extreme UHS-I microSDXC card is aimed at the Android smartphone market, as well as the ever-growing action camera and drone markets.

To protect against data loss, the card also gives the owner a copy of RescuePRO Deluxe data recovery software.

Western Digital is also using MWC 2018 to demonstrate PCIe-enabled cards, which the company is calling "the future of flash memory card technology." A single lane of the PCIe Gen 3.0 standard offers speeds of up to 985MB/s, performance the company says will revolutionize high-resolution content applications such as super-slow-motion video, RAW continuous burst mode, and 8K video capture and playback.
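
That 985MB/s figure falls straight out of the PCIe Gen 3.0 signaling rate:

```python
# PCIe Gen 3 runs at 8 GT/s per lane with 128b/130b line coding.
transfers_per_s = 8e9
efficiency = 128 / 130
usable_bytes = transfers_per_s * efficiency / 8

print(f"{usable_bytes / 1e6:.0f} MB/s per lane")  # ~985 MB/s
```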

The company also unveiled two new NVMe 3D NAND SSDs -- the PC SN720, available in the M.2 2280 form factor with capacities of 256GB to 1TB, and the PC SN520, available in M.2 2280, 2242, and 2230 form factors with capacities between 128GB and 512GB -- both of which feature nCache 3.0 architecture aimed specifically at delivering performance and endurance.

"With this new vertically integrated SSD platform, we are able to optimize the architecture to our NAND for low latency and power efficiency, and most important, for the growing range of applications benefiting from NVMe," said Eyal Bek, Senior Director of Client SSD, Devices Business Unit, Western Digital. "The scalable architecture supports a range of capacity and performance points while streamlining system qualification to improve time-to-market for our customers."



2/27/2018 09:24:00 PM

Fail-slow at scale: When the cloud quits working

Computer systems fail. Most failures are well behaved: the system stops working. But there are bad failures too, where the system works, after a fashion, s-l-o-w-l-y. Which components are most likely to fail slow? The answers may surprise you.


If you've ever had a system fail slow, you know how annoying it is. The lights are on, the fans are running, but nobody's home. Is it software? A background process run amok?

So you reboot and hope for the best -- and maybe that works. If not, more hair pulling.

Now imagine you have a 100-node cluster that suddenly slows to a crawl. Reboot 100 nodes?

No, you have management software to look at. And... nothing. Nothing the software sees that could be the problem cluster-wide. So you start looking at each server. Then the storage. Then the network.

And sure enough, there's a slow 1Gb NIC -- a $50 board -- that has caused a cascading failure that has brought your half-million dollar cluster to its knees.

FAIL-SLOW AT SCALE

At this month's Usenix File and Storage Technology (FAST '18) conference in Oakland, California, a team of 23 authors presented Fail-Slow at Scale: Evidence of Hardware Performance Faults in Large Production Systems. They describe fail-slow failures at five companies, four universities, and three national labs, with node counts ranging from more than 100 to more than 10,000.

They found that some of the faults were permanent until fixed, or caused partial slowdowns, while others were transient, which are the hardest to diagnose.

The paper has some cautionary tales that are amusing, at least in retrospect.

. . . one operator placed an office chair next to a storage cluster. The operator liked to rock in the chair, repeatedly popping hotplug drives out of the chassis (a hard correlation to diagnose).

But many of the failures were more subtle:

". . . a vendor's buggy firmware caused a batch of SSDs to stall for seconds at a time, disabling the flash cache layer and causing the entire storage stack to run slow."

". . . a machine was deemed nonfunctional due to heavy ECC correction of many DRAM bit-flips."

". . . bad chips in SSDs reduce the size of the over-provisioned space, triggering more frequent garbage collection."

". . . applications that create a huge load can cause the rack power supply to deliver insufficient power to other machines (degrading their performance), but only until the power-hungry applications finish."

"A fan in a compute node stopped working, causing the other fans to compensate for the dead fan by running at maximal speeds, which then caused a lot of noise and vibration that subsequently degraded disk performance."

Typically, finding these problems took hours at a minimum, and often days, weeks, or even months. In one case an entire team of engineers was pulled off a project to diagnose a bug, at a cost of countless dollars.

ROOT CAUSES

The paper summarizes the causes of the 101 fail-slow incidents they examined. Network problems were the #1 cause, followed by CPU, disk, SSD, and memory. Most of the network failures were permanent, while SSDs and CPUs had the most transient errors.

Nor does the root cause necessarily rest with the slow hardware, as in the case above where a power-hungry application on a few servers made other servers slow down. In another case, the vendor couldn't reproduce the customer's high-altitude failure mode at their sea-level facility.

THE STORAGE BITS TAKE 

Any sysadmin plagued by slowdowns should read this paper. The researchers' taxonomy and examples are sure to be useful in expanding one's vision of what could be happening.

For (one more) example:

In one environment, fan firmware would not react quickly enough when CPU-intensive jobs were running, and so the CPUs entered thermal throttling (reduced speed) before the fans had a chance to cool them off.

All in all, a fascinating summary of failure statistics and types. And, for those of us who don't manage large clusters, a welcome sense of many bullets dodged. Whew!




2/27/2018 07:41:00 PM

Sony's Xperia XZ2 smartphone adds 3D scanning capabilities

The Sony Xperia XZ2 comes with 3D scanning capabilities on its front-facing camera, S-Force surround sound, Bravia screen tech, and a dynamic vibration system adapted from its PlayStation controllers.


Sony Mobile has taken the wraps off its new Xperia XZ2 flagship smartphone and a compact version at Mobile World Congress (MWC) 2018 in Barcelona, with the company telling ZDNet it is focused on entertainment.

"Typically, Sony customers understand us certainly not only as an electronic company but an entertainment company. And so for us, the sweet spot, the bullseye, is certainly those Sony customers that understand that about us, and then we'll extend from there," Oceania MD for Sony Mobile Communications John Featherstone told ZDNet.

According to Sony, the Xperia XZ2 takes the best tech from all of the company's consumer products for the new line of smartphones: S-Force surround sound and LDAC wireless high-resolution audio from its audio range; screen technology from its Bravia TVs; photography tech from its Alpha and Cyber-shot cameras; five-way image stabilisation from its Action Cam; and the dynamic vibration system and augmented reality from its PlayStation engine. Also new: a center-aligned fingerprint scanner.

The vibration system causes the Xperia XZ2 to vibrate when playing audio with bass-heavy sounds, which Featherstone said adds the sense of touch to the experience -- much like when watching a trailer in a cinema or listening to music at a concert.

"What we're doing is we're taking the knowledge out of a gaming environment, and you can feel what you're seeing and experiencing as well," Featherstone told ZDNet.

"Whatever entertainment you're watching -- movies, gaming, whatever -- you'll be fully immersed in the moment, and that's what it's about, that's what we're trying to get people to experience.

"There's a number of theoretical world firsts [with this handset] ... but certainly, with that dynamic vibration system, it really starts to become an entertainment device."

The Xperia XZ2 also features a 5.7-inch high dynamic range (HDR) full HD+ 18:9 display; front-facing stereo speakers that make it 20 percent louder than the XZ1; 4K HDR video recording; full HD 960 FPS super slow-motion capture; a Motion Eye camera with smile detection, predictive capture, autofocus burst, and predictive autofocus; a Qualcomm Snapdragon 845 mobile platform for 1.2Gbps 4G speeds; Bravia Xreality tech that up-converts content to near-HDR, improving the quality of non-HDR videos; a 3D glass surface with a metal frame; Corning Gorilla Glass 5; and Android 8.0 Oreo.

Sony has additionally added 3D Creator's scanning capabilities to the front-facing camera so users can create three-dimensional selfies that can be uploaded to Facebook and used for avatar creation -- something that could be extended to Sony's gaming range in future, Featherstone told ZDNet.

The rendering of the 3D scan was previously performed on the handset, with Sony now adding server-side processing instead for greater accuracy and detail.

It comes in four colors: Liquid silver, ash pink, deep green, and liquid black.

The XZ2 has a 3180mAh battery with wireless and wired charging; its Qnovo adaptive charging also improves the device's battery lifespan, preventing the battery from overcharging by charging it to 80 percent when first plugged in, and then completing the remaining 20 percent in the last hour based on usage patterns.

The XZ2 Compact, meanwhile, comes in a 5-inch model with a non-scratch polycarbonate finish. While it has an HDR full HD+ display, front-facing stereo speakers, Motion Eye with 3D Creator, super slow motion, and hi-res audio, it does not include the wireless charging or dynamic vibration technologies.

The Compact smartphone is also kitted out with Qualcomm's Snapdragon 845 mobile platform and Corning Gorilla Glass 5 and comes in four colors: White silver, black, moss green, and coral pink.

2/27/2018 04:26:00 PM

Nokia 8110 'Matrix' slider phone is back: This time it's reloaded with 4G

The Nokia 8110 curved slider phone makes a return in bright 'banana yellow'.



HMD Global, the company with a license to make Nokia phones, has once again gone retro to capture some attention at Mobile World Congress.

Following last year's relaunch of the 3310 candy bar, its 2018 blast from the past is the curved Nokia 8110 slider phone. The phone was launched in 1996 and later made famous by Keanu Reeves' character in the 1999 action flick The Matrix.

Fans of the slider design will be happy to know that you can answer a call by flicking open the device and end a call by sliding it shut. The original GSM Nokia 8110 was marketed as capable of storing 324 contacts and was the first Nokia phone with a monochrome graphics display.

The updated Nokia 8110 is, of course, a 4G phone and comes with a range of apps including Snake, Google Assistant, Google Maps, Google Search, Facebook, and Twitter. Apps run on a smart feature OS based on KaiOS, which apparently contains elements of Mozilla's abandoned Firefox OS.

The 4G Nokia 8110 has a 2.4-inch QVGA curved display, a 1.1GHz Qualcomm 205 processor, 512MB RAM, and 4GB storage. Unlike the updated 3310, there is no MicroSD slot. It also has a two-megapixel camera with LED flash and a removable 1,500mAh battery.

HMD says the phone offers around seven hours' talk time and standby time of up to 25 days. Extra features include an FM radio.

In a nod to the 'banana phone' nickname it earned courtesy of the curved design, HMD is offering the Nokia 8110 in banana yellow and, for those who want a more Matrix-like phone, black.

The phone is expected to retail for €79, or about $100, excluding taxes and subsidies, and will be available from May.

HMD Global debuted the Nokia 8110 alongside four new smartphones, including the Nokia 8 Sirocco, Nokia 7 Plus, Nokia 6 and Nokia 1. They're all part of Google's 'pure and secure' Android One program.

The Nokia 1 is also one of the first Android Oreo Go Edition smartphones. Android Go is optimized for devices with less than 1GB of RAM and comes with a range of Go apps to assist users in areas with patchy connectivity. The Nokia 1 will cost $85 when it goes on sale in April and is available in red or dark blue.

HMD's new flagship, the Nokia 8 Sirocco, runs on a Qualcomm Snapdragon 835 and comes with 6GB RAM, 128GB storage, and a 5.5-inch QHD plastic OLED display. It also offers wireless charging and will cost €749 (about $920) when it goes on sale in early April.

According to HMD, the company sold an impressive 70 million Nokia-branded phones last year.



2/27/2018 04:08:00 AM

How the cloud will save - and change - disk drives

Google has changed many aspects of computer infrastructure, including power supplies and scale-out architectures. Now they're asking vendors to redesign disks for cloud use. How will that affect you?



At a past Usenix FAST conference, Eric Brewer, Google's VP of Infrastructure, gave a keynote address, Disks and their Cloudy Future. For cost reasons, Google and the other big internet data centers remain committed to disk drives, but they'd like some changes to make disks cloud-friendly, even at the cost of making them less consumer-friendly.

The cloud and SSDs have already forced big changes on disk drives. Mobile computing is mostly SSD-based today, and will only get more so. High-performance drives - the 10k and 15k drives - are still being sold, but new models are not coming out. The industry is changing anyway, so why not listen to Google?

GOOGLE'S PROBLEM

Google's data centers are different from your enterprise. First, they run many tens of thousands of servers with software designed to automatically recover from server failures. Second, they have a high degree of data redundancy, so the failure of a single disk, or a single server, doesn't affect data availability.

Nor do they much care about write latency, because most of their workload is reads. And since the data is spread among multiple servers and drives, they don't care about features that optimize a drive for the server.

Their big concern is long tail read latency. The mean read time is about 10ms, but the 99th percentile is 100ms, and 99.9th percentile is seconds. That's too long.
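
The numbers above are percentiles, not averages, and that's the point: a tiny fraction of very slow reads barely moves the mean but dominates p99 and p99.9. A toy illustration with invented numbers:

```python
import random

random.seed(0)
# 99% of reads cluster around 10ms; 1% are stragglers.
samples_ms = [random.gauss(10, 2) if random.random() < 0.99
              else random.uniform(100, 3000)
              for _ in range(100_000)]

def percentile(data, p):
    s = sorted(data)
    return s[int(p / 100 * (len(s) - 1))]

print(f"p50  : {percentile(samples_ms, 50):7.1f} ms")
print(f"p99  : {percentile(samples_ms, 99):7.1f} ms")
print(f"p99.9: {percentile(samples_ms, 99.9):7.1f} ms")
```

The median stays near 10ms while the tail runs to seconds, which is exactly the shape Google is complaining about.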

WHAT GOOGLE WANTS

Google wants lower latency, which is a combination of more IOPS and fewer interruptions with the disk's I/Os. But how do you get there?

Since all the data is replicated, Google doesn't need elaborate methods to achieve the unrecoverable read error rate of less than 1 in 10^15 bits that is common on today's server drives. Since those error rates are achieved by heroic retry efforts and lengthy, capacity-consuming ECC, increasing the unrecoverable read error rate would reduce latency and increase drive capacity.

Vendors could also dispense with remapping bad blocks and maintaining spare blocks, since Google treats all disks as part of a giant block pool. They don't need all drives to be the same size or any particular size, unlike RAID arrays.

Google would also like disk drives to be smarter about I/O. They label all I/Os as either a) low latency, b) max bandwidth, or c) best effort. They'd like the disk drive to understand those labels and to perform I/Os accordingly, since the disk knows best where the data and its heads are.

Also, Google would like specialized APIs so it can manage disks from above the server level. Remember, any server can fail without data loss, so Google needs a way to manage disks as a giant block pool.

THE STORAGE BITS TAKE

Vendors got right on Google's demand for efficient power supplies, and the rest of us benefitted. But their desires for disk drives aren't as good for consumers.

My guess is that drive vendors are working on stripped-down firmware that they can put on their high-volume drives, that takes out lots of features that enterprises and consumers alike rely on, such as low error rates and bad block replacement, and adds in the APIs that Google wants.

The name of the disk manufacturing game is volume. If the cloud vendors keep buying high capacity disks, it doesn't matter what version of firmware they run. We consumers will be able to join in the fun of reliable, high capacity and low-cost disk storage for years to come.




2/27/2018 01:00:00 AM

Microsoft's DNA storage breakthrough could pave way for exabyte drives




The prospect of storing vast amounts of data on DNA has come closer to reality thanks to a new technique for retrieving data.

Microsoft is keen on synthetic DNA as a future long-term archival medium that could solve the world's need for more data storage. Previous research has shown that just a few grams of DNA can store an exabyte of data and keep it intact for up to 2,000 years.

The drawback is that it's expensive and extremely slow to write data to DNA, which involves converting 0s and 1s to the DNA bases adenine, thymine, cytosine, and guanine. Getting data back from DNA involves sequencing it and decoding files back to 0s and 1s. Finding and retrieving specific files stored on DNA is also a challenge.
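
At its simplest, the encoding maps two bits to each base. The real Microsoft/UW scheme is far more sophisticated -- adding addressing, error-correcting redundancy, and constraints on the synthesized sequences -- but a toy version shows the core idea:

```python
# Two bits per base: a toy DNA encoder/decoder.
TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
TO_BITS = {base: bits for bits, base in TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")
print(strand)                   # CAGACGGC
assert decode(strand) == b"Hi"  # round-trips
```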

As scientists from Microsoft Research and the University of Washington explain, without random access, or the ability to selectively retrieve files from DNA storage, you'd need to sequence and decode an entire dataset to find and retrieve the few files you want. Creating random access would reduce the amount of sequencing that needs to be done.


Microsoft researcher Yuan-Jyue Chen, background, and Lee Organick work on DNA research at the University of Washington

Image: Dennis Wise, University of Washington
To achieve random access to DNA, they created a library of 'primers' that are attached to each DNA sequence. The primers, together with polymerase chain reaction (PCR), are used as targets to select desired snippets of DNA through random access.

"Before synthesizing the DNA containing data from a file, the researchers appended both ends of each DNA sequence with PCR primer targets from the primer library," the University of Washington explains.

"They then used these primers later to select the desired strands through random access, and used a new algorithm designed to more efficiently decode and restore the data to its original, digital state."

The researchers also developed an algorithm for decoding and restoring data more efficiently. Microsoft senior researcher Sergey Yekhanin said the new algorithms are more tolerant to errors in writing and reading DNA sequences, which cuts the sequencing and processing needed to recover information.

While it's not the first time random access to DNA has been achieved, it's the first time it has been done at such a scale, according to the researchers.

The researchers encoded to synthetic DNA a record 200MB of data consisting of 35 files ranging in size from 29kB to 44MB. The files contained high-definition video, audio, images, and text.

Since releasing the paper describing the technique, they've also encoded and retrieved files from 400MB of data on DNA.

The researchers believe the approach they have used for random access will scale to physically isolated pools of DNA containing several terabytes each.




Monday, February 26, 2018

2/26/2018 10:54:00 PM

Head in three clouds: ANAO finds ATO contracts missing service commitments




The Australian Taxation Office (ATO) has once again found itself the focus of an investigation, following a turbulent 18 months of IT-related incidents and system outages plaguing the organisation.

While probes into its physical equipment have previously been the focus, the Australian National Audit Office (ANAO) on Tuesday called the taxation office out for lacking on the service commitment front, particularly where cloud is concerned, noting that a year-old agreement with Amazon Web Services (AWS) does not include service-level arrangements.

"This contract exposes the ATO to contractual and operational risks without measurable service levels," ANAO wrote in its report [PDF], Unscheduled Taxation System Outages.

In assessing whether the ATO has effectively responded to recent unscheduled IT system outages, ANAO revealed the ATO began establishing cloud computing contracts in 2016, now boasting three separate agreements: with Macquarie Telecom for its MacGov cloud since May 2016, with Microsoft's Azure, and the AWS contract that began in December 2016 -- days after the first outage brought the ATO's online services down.

The three cloud contracts came eight years after an ICT Sourcing Program resulted in contracts for three separate groups of service "bundles": end-user computing, contracted to Leidos; managed network services, contracted to Optus; and centralised computing, contracted to DXC Technology -- formerly Hewlett Packard Enterprise (HPE).

In testing the ATO's IT service measures, ANAO found only the MacGov contract had assessment summaries in place -- and that was only for two of the four key elements ANAO had examined. Where physical kit was concerned, ANAO appeared satisfied with the documentation in place.

"Its three major bundles contracts incorporated a Performance Framework in their contractual service-level agreements. Consistent with that framework, the service measures were for the most part well specified across the categories of service indicators; service monitoring and reporting; critical system deliverables; and business assessments," the report reads.

The three bundle contracts are due for renewal this year, which ANAO said gives the ATO an opportunity to reassess its IT service measurement approach and, where possible, implement common approaches, at least in terms of "reflecting tolerances that align with the IT outage service standards that the ATO has committed to develop".

"Such an approach would support the ATO in its efforts to use digital technology and online services effectively and efficiently in the administration of the taxation and superannuation systems," it added.

Of the IT-related incidents plaguing the taxation office, there were two major system failures, the first occurring in December 2016, and a subsequent outage in February 2017 the result of work to fix the fibre cabling from the first.

A report from the ATO into the outages revealed the HPE-owned and operated SAN couldn't handle more than one drive or cage failure because of a design decision taken by the tech giant. An analysis of logs from the six months before the incident showed numerous alerts indicating problems with the SAN.

"Since May 2016, at least 77 events related to components that were seen to fail in the December 2016 incident were logged in our incident resolution tool," the ATO said previously. "We were not made fully aware of the significance of the continuing pattern of alerts, nor the broader systems impacts that would result from the failure of the 3PAR SAN."

The report described HPE's lack of preparation for an event of the kind experienced by the ATO in December 2016.

"Recovery procedures for applications in the event of a complete SAN outage had not been defined or tested by HPE," the ATO said.

Regarding the non-identification of SAN risks, ANAO highlighted that the system recovery tools used by the ATO to restore its data management, system monitoring, and backup/restore systems were in the same datacentre, on the affected SAN.

"The system failure meant that these tools were unavailable, and there were no backup or redundant system recovery tools available on other ICT systems to identify and analyse the incident and to assist efforts to recover and restore services," ANAO wrote.

In the second major outage, a data card was ejected mid-operation and caused the SAN to behave in much the same way as in the December incident. In both cases, the SAN was unable to automatically restore itself and shut down to preserve data.

In the February incident, the ATO website stayed up, as it had been taken off the SAN and hosted in a cloud environment.

As a result of the incidents, the ATO rebuilt its storage solution with a new 3PAR and decommissioned the old one in July for forensic investigation.

"The December 2016 and February 2017 incidents highlight that the ATO did not have a sufficient level of understanding of system failure risks," ANAO's report added. "The ATO's risk management and BCM [business continuity management] processes did not include an assessment of risks associated with storage area networks, which were a potential single point of failure. Also, BCM processes were limited in anticipating critical infrastructure and ICT system failure at the datacentres."

As a consequence, ANAO said the ATO -- including DXC and Leidos -- was not prepared for the possibility of complete system failure caused by storage failure. It also found the ATO did not have a secondary enterprise system in place, other than a disaster recovery strategy.

It also reported that at the time, cloud services were considered for implementation but not fully implemented.

Leidos, ANAO stated, also had not identified that the SANs were a single point of failure.

ANAO, however, said the ATO's responses to the system failures and unscheduled outages were "largely effective", despite shortcomings in business continuity management planning relating to critical infrastructure.

Making a total of three recommendations, ANAO has asked the ATO to also update its BCM, IT service continuity management (ITSCM), and risk management frameworks to "improve and better integrate the identification and treatment of risks to critical infrastructure that may lead to system failures".

The final recommendation asks that the government entity "determines the level of availability of services associated with its ICT systems to incorporate into service standard(s) and subsequently reports performance against those standard(s)".

Following the two major incidents, the ATO has experienced numerous outages and mainframe reboots, with the most recent outage in September affecting its online services.

Despite the HPE hardware being at the centre of the first outage and a handful of resulting issues, the ATO contracted DXC Technology for the provision of a further AU$735 million in "centralised computing" in December 2017, bringing the total value of its contract with the tech giant to AU$1.47 billion.



2/26/2018 09:44:00 PM

Samsung to sell Galaxy S9 super slow-mo image sensors

Samsung's ISOCELL Fast 2L3 image sensor, which powers the Galaxy S9's camera, supports 960 frames per second super slow-motion recording and is likely aimed at Chinese smartphone vendors.



Samsung Electronics has launched the new ISOCELL Fast 2L3 image sensor that supports 960 frames-per-second super slow-motion recording.

It is installed on the new Galaxy S9 and S9 Plus smartphones unveiled at Mobile World Congress.

The sensor integrates 2Gb of DRAM (LPDDR4) below the conventional pixel array layer and an analog logic layer, which allows for high-speed image readout.

It allows 1/120-second snapshots and 960 frames-per-second recording.
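
Some rough arithmetic shows why the stacked DRAM matters. Assuming a 720p, 10-bit readout for super slow-mo (an assumption; Samsung doesn't publish the exact format), the sensor has to move data faster than a phone's main memory path can comfortably absorb during the burst, so it buffers the frames locally:

```python
# Back-of-envelope for 960fps super slow-mo buffering.
frame_bits = 1280 * 720 * 10   # one 720p frame at 10 bits/pixel (assumed)
rate_bits = frame_bits * 960   # bits per second of capture
dram_bits = 2 * 1024**3        # the integrated 2Gb LPDDR4

print(f"readout rate: {rate_bits / 8e9:.2f} GB/s")                # ~1.11 GB/s
print(f"buffer holds: {dram_bits / rate_bits:.2f} s of capture")  # ~0.24 s
```

That roughly 0.2-second ceiling lines up with the short super slow-mo bursts the Galaxy S9 records.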

The design is similar to that of Japanese tech giant Sony's image sensor with similar capabilities, which also integrates a DRAM.

The 1.4-micrometer, 12-megapixel ISOCELL Fast 2L3 also supports HDR and noise reduction, and prevents jello effect and image distortion.

Samsung will likely now compete with Sony to woo Chinese vendors, which, except for Xiaomi, are yet to adopt slow-motion features in their smartphone cameras.

The Japanese tech giant has dominated in CMOS image sensors thanks to its long legacy of camera technology, an area in which the South Korean tech giant is a fast-rising runner-up.

Earlier this month, Samsung introduced the ISOCELL Dual aimed at providing dual camera features for budget phone clients.





2/26/2018 08:02:00 PM

Samsung Galaxy S9, Galaxy S9 Plus: Should you upgrade based on specs, price and camera improvements?

Samsung is betting that camera improvements will spur an upgrade cycle to the Galaxy S9 and Galaxy S9 Plus. Here's a look at the enterprise angle, specifications, deals, DeX and competitive landscape to help you decide.



Samsung launched its Galaxy S9 and Galaxy S9 Plus at Mobile World Congress in Barcelona, outlined a camera-heavy pitch and updated its flagship smartphones. The next big question is whether the Galaxy S9 and S9 Plus are worth the upgrade.

Here's a look at the moving parts in the buying decision:

THE CORE PITCH

Samsung is looking to redefine the camera experience with the Galaxy S9 and Galaxy S9 Plus. DJ Koh, head of Samsung's IT and mobile communications division, said technology is changing lives, but it only does so when innovation is in the hands of real people. Koh referred to customers as creators. "The most important function of the phone is not making calls. It's to capture that fleeting moment in a photo and sharing moments in videos. People took 1.2 trillion photos on their smartphones," said Koh, who called the camera upgrade a breakthrough.

Samsung also said its Galaxy S9 and S9 Plus can manage your connected life, but at the end of the day, the core pitch revolves around the camera and its ability to take pictures in low light. The camera has its own DRAM processor as well as dual apertures to handle all lighting conditions.




And yes, Samsung moved the fingerprint scanner to a better location. The new Bixby features are interesting, but hardly a reason to upgrade. Ditto for the Internet of Things hooks into the SmartThings cloud, and the iris scanner and facial recognition features. In the end, the Galaxy S9 and Galaxy S9 Plus will be about camera improvements and a less annoying fingerprint sensor for many folks.

SPECIFICATIONS

Both devices feature the latest Qualcomm Snapdragon processor, and that chip is one of the primary reasons Samsung can boast camera improvements. The primary thing to note here is that Samsung's Galaxy S9 Plus has dual cameras and the 5.8-inch version doesn't. The Galaxy S9 Plus joins the Galaxy Note 8 as a dual-camera flagship device. Here's a look at the core specs:

GALAXY S9 SPECIFICATIONS

Processor: Qualcomm Snapdragon 845 octa-core (Exynos outside the US)

Display: 5.8 inch 2960x1440 pixels resolution Super AMOLED

Operating system: Android 8 Oreo

RAM: 4GB

Storage: 64/128/256GB internal with microSD card slot

Cameras: Super speed dual pixel 12 megapixel OIS and mechanical dual aperture (f/1.5 and f/2.4). Front 8-megapixel camera, f/1.7.

Battery: 3000 mAh with fast wired and wireless charging

Water resistance: IP68

Wireless connectivity: 802.11 a/b/g/n/ac WiFi, NFC, Bluetooth 5, ANT+, GPS, Glonass, Galileo, BeiDou, MST

Audio: Dual stereo speakers tuned by AKG with Dolby Atmos, 3.5mm headset jack

Sensors: Iris, pressure, accelerometer, barometer, fingerprint, gyro, magnet, heart rate, proximity, RGB light

Dimensions: 147.7 x 68.7 x 8.5 mm and 163 grams

GALAXY S9 PLUS SPECIFICATIONS

Processor: Qualcomm Snapdragon 845 octa-core (Exynos outside the US)

Display: 6.2 inch 2960x1440 pixels resolution Super AMOLED

Operating system: Android 8 Oreo

RAM: 6GB

Storage: 64/128/256GB internal with microSD card slot

Cameras: Super speed dual pixel 12 megapixel OIS and mechanical dual aperture (f/1.5 and f/2.4) and 12-megapixel f/2.4 telephoto camera. Front 8-megapixel camera, f/1.7.

Battery: 3500 mAh with fast wired and wireless charging

Water resistance: IP68

Wireless connectivity: 802.11 a/b/g/n/ac WiFi, NFC, Bluetooth 5, ANT+, GPS, Glonass, Galileo, BeiDou, MST

Audio: Dual stereo speakers tuned by AKG with Dolby Atmos, 3.5mm headset jack

Sensors: Iris, pressure, accelerometer, barometer, fingerprint, gyro, magnet, heart rate, proximity, RGB light

Dimensions: 158.1 x 73.8 x 8.5 mm and 189 grams

THE PRICE

Samsung is holding the pricing line, with the Galaxy S9 and Galaxy S9 Plus priced in line with the Galaxy S8 and S8 Plus. The pricing from Samsung is also less expensive than Apple's iPhone X. As noted in Matthew Miller's roundup, the Galaxy S9 from T-Mobile runs $720 full price and the Galaxy S9 Plus costs $840. There are also trade-in promotions. Deals from AT&T and Verizon are also in that price range but are a bit higher. Deals from Best Buy and other retailers are likely to be available. For instance, Xfinity Mobile from Comcast is offering $450 off the Galaxy S9 and Galaxy S9 Plus for a limited time.

IHS Markit analysts Gerrit Schneemann and Wayne Lam noted:

Success in the flagship pricing tier will be critical in 2018 for Samsung. Unless Samsung changes its portfolio strategy, IHS Markit forecasts Samsung's smartphone shipments will decline by 2.6 percent in 2018 compared with overall market growth of 3.9 percent. With the S9 and S9 Plus' iterative designs, Samsung is banking on its successful design language from 2017 to continue to match buyer preferences, while Samsung adds enough new features to justify upgrades.

WHAT'S THE TRADE-IN VALUE OF CURRENT PHONE?

The deals from carriers and retailers for the Galaxy S9 and Galaxy S9 Plus may be sweetened by trade-in offers. Koh even mentioned trade-in promotions during his keynote, so Samsung may be revving up the incentives. You can also shop around via various merchants to see what your smartphone is worth and find ways to offset the price of a new device. See CNET's smartphone trade-in guide.

WHAT IF YOU HAVE THE GALAXY NOTE 8?

Many of the features in the Galaxy S9 Plus will wind up in the Galaxy Note 8 via software updates. The Galaxy S9 Plus may have better camera technology, but the other features aren't enough to spur a replacement. Samsung has set up some serious flagship confusion, though.

HOW DOES GALAXY S9 AND GALAXY S9 PLUS COMPARE TO IPHONE X, GOOGLE PIXEL 2 AND OTHER LESS EXPENSIVE ANDROID PHONES?

Samsung has dual cameras in the Galaxy S9 Plus and improvements to iris and facial scanning. The price points for Samsung's latest devices will fall below the iPhone X, so that's a potential win.

As for the Pixel 2, Google has a single camera but has advanced the technology via software and machine learning.

What's also notable is how Galaxy S9 and S9 Plus will compete with Android rivals such as Huawei, Nokia, LG, and others. Many smartphone makers have gone with a pure Google Android experience and have been using that as a selling point. The Galaxy S9 and Galaxy S9 Plus may compete with phones from the likes of Nokia too.

WHAT'S THE BUSINESS PLAY?

As an enterprise device, Samsung has used the Galaxy Note 8 as the business flagship. What changed, however, is that the Galaxy S9 now has an enterprise edition. Samsung is using its smaller Galaxy S9 as the enterprise play and adds customization for enterprise-specific apps and more control for central IT. Samsung's enterprise edition approach doesn't apply to the Galaxy S9 Plus.

It's also worth noting that Google is launching its Android Enterprise Recommended program for smartphones and left Samsung out of the launch to start. That omission is something C-level executives may notice.

Samsung Galaxy S9 Enterprise Edition features include:

  • Knox Configure, which enables enterprises to remotely provision and configure devices, and Knox 3.1.
  • Granular device settings.
  • Customizations for industry-specific use cases.
  • The ability to turn the Samsung Galaxy S9 Enterprise Edition into a single-purpose appliance.
  • Enterprise firmware over the air so IT departments can manage a fleet of devices via a central OS.
  • Security updates for four years after market availability.
  • Product lifecycle management for two years.
  • A network of partners to tailor experiences for employees.


The hardware specifications are identical to the standard Galaxy S9 listed above.

IS DEX PAD A DIFFERENTIATOR?

One thing the Samsung Galaxy S9 and Galaxy S9 Plus bring to the table is the DeX dock and a desktop experience. Samsung outlined its DeX Pad, which brings a cleaner design to what essentially can be a thin client for enterprises. The core concept here is that you can bring your device, dock it, and use all the ports -- two USB Type-A ports, an HDMI port, and a USB Type-C port, as well as Ethernet -- to replicate a desktop. DeX is a system that's not available for other devices, and Samsung is sticking with the approach. If this desktop experience and smartphone docking capability is a win for you, then DeX may sway you to the Galaxy S9 and S9 Plus.

Even if DeX isn't a winner yet, Samsung is showing that it will stick with the platform. That doggedness may mean DeX ultimately succeeds.



YOUR PERSONAL BUYING STRATEGY WITH ANDROID DEVICES?

Samsung is obviously in the premium device arena, but lower-end phones are catching up in features and cameras, and relying on Google to improve the software experience. Do you need the Galaxy S9 and Galaxy S9 Plus, or a $399 phone from the likes of Huawei or Nokia? Will a Sony Xperia smartphone work for you? The premium vs. mid-tier pricing battle is only going to continue. We're in the age of smartphone innovation stagnation. Samsung is feeling it to some degree, and Apple is too. The real pricing battleground will be in the Android ecosystem.