Choosing the right smartphones and tablets to test your apps against is just as important as the tests themselves. After all, if your app isn’t working at the right place (your customer’s device) at the right time (all the time), angry customer tweets and two-star app reviews will follow, and the consequences for your business will be severe.
In some cases, your app won’t even make it into the app store. Twenty percent of apps are rejected from the Apple store because they’re buggy, crash-prone or have an inferior user interface. And when a new version of Android arrives, apps that worked fine on the previous OS version start to act like rejects themselves.
On the hardware side of the fence, new devices arrive every quarter, and device usage varies by geography. Windows Phone may hold 3.8% market share in the US, but in Europe it averages close to 9%, according to mobile market research firm ComTech. Basing your testing of Windows Phones solely on US market data when your app also has a presence in, say, France (12%) or Italy (13.3%), would be a costly mistake.
The bottom line is that mobile devices and operating systems are always in flux. You need to keep a close eye on market changes so you’ll be testing your apps and responsive websites against the devices that represent your target users. To that end, we created a Mobile Test Coverage Index report, a quarterly release that aggregates data from over 4,000 device profiles and 360,000 hours of customer usage data, as well as mobile market share data and future device release timelines, to create a list of the 32 most relevant mobile devices.
The device selections address five key factors: device model, screen size, general availability (GA) date, hardware system and recommended operating systems. The number of devices in each test bracket may vary from quarter to quarter to reflect the restless mobile market. The report will be continuously updated to keep pace with market changes.
While the report will not hand-deliver an answer as to which specific devices and operating systems a company should test against, it can serve as a benchmark from which organizations can customize their test coverage according to their own business and customer needs.
Some noteworthy insights from the research:
■ Almost a third (30 percent) of the market can be covered with 10-16 devices – including popular devices such as the Samsung Galaxy S5, iPhone 6 and HTC One M8 – based on market adoption, market device leaders, reference devices and device characteristics.
■ Regarding operating systems: iOS 7 still accounts for some 17 percent of total users out there, and should be included at the same level as iOS 8 in test plans. Additionally, Android 5.0 adoption rates are low in both the US and Europe, with Android KitKat (4.4.x) and Jelly Bean (4.2.x, 4.3.x) rating as the most commonly used versions.
■ Geographically speaking, iPhones hold twice the share in the US as in Europe, with iPhone devices capturing 35% of the U.S. market, as opposed to 16% in EU5.
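The coverage idea behind the first insight above can be sketched programmatically: rank devices by market share and keep adding them until a target coverage threshold is hit. This is a minimal illustration only; the device names and share figures below are invented placeholders, not data from the report, and a real test plan would also weigh reference devices and device characteristics, as the research notes.

```python
# Hedged sketch: greedy device selection by market share.
# All device names and percentages here are hypothetical examples.

def pick_devices(market_share, target=0.30):
    """Return the smallest share-ranked device list whose combined
    market share meets the target (e.g. 30 percent)."""
    chosen, covered = [], 0.0
    for device, share in sorted(market_share.items(),
                                key=lambda kv: kv[1], reverse=True):
        if covered >= target:
            break
        chosen.append(device)
        covered += share
    return chosen, covered

# Invented example data -- replace with real usage/market-share figures.
shares = {
    "Samsung Galaxy S5": 0.12,
    "iPhone 6": 0.11,
    "HTC One M8": 0.05,
    "Device D": 0.04,
    "Device E": 0.03,
}

devices, coverage = pick_devices(shares, target=0.30)
print(devices, round(coverage, 2))
```

In practice you would feed this kind of ranking with per-region data, since, as noted above, device share differs sharply between the US and Europe.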
We can’t overstate how important device coverage is to a mobile app testing strategy, for business and technical executives alike. A relevant device and operating system mix will get an organization much closer to the true end-user experience. However, it’s a challenge to keep your test coverage relevant given that mobile devices have a shelf life of around nine months. Benchmarks built from data like this report will help, but in an ever-changing market it’s essential to keep reviewing continually updated research.