Many companies operate under the misperception that an app is an essential part of any attempt to enter the mobile space.
By Mario Matthee, Chief Operating Officer of the DVT Global Testing Centre.
The mobile software testing market is experiencing something of a Jekyll-and-Hyde complex at the moment.
On the one hand, mobile device growth is continuing apace, with predictions that mobile users will overtake PC and laptop users for most business and web-related tasks having long since been realised. On the other, the app boom seems to be well and truly over, with recent figures in the US showing users are downloading an average of zero apps per month.
This isn’t difficult to explain: it simply means the mobile market, which began its upward trajectory more than eight years ago, has reached saturation, and most users are now familiar with the apps they need and use every day. There are some exceptions – like Snapchat and Uber – that defy the trend and are still growing at a phenomenal rate, but unless you’re very good or very lucky, it’s going to be difficult to get your new app noticed and downloaded among the crowd.
How does this affect mobile testing, you ask? In two fundamental ways. First, the drop-off in new app development means companies have a decision to make when it comes to reaching out to their customers through mobile platforms. Apps are no longer the first step in creating a mobile presence; for many companies a responsive mobi (mobile-oriented) site makes more sense.
Second, device selection is vital, and increasingly so. The rapid growth and maturity of mobile devices in general, and smartphones in particular, has seen the market ultimately settle on two major platforms – iOS and Android. Smaller platforms such as Windows Mobile and BlackBerry are shrinking, and BlackBerry is even migrating its users to various flavours of Android.
Because of this polarity, and the loyalty of most users to one platform or another, developing and testing native apps is not always the smart choice, especially for newcomers to the mobile space. But if testing on a limited number of devices is inadequate, testing on a very large number of devices is often prohibitively expensive.
And so the starting point for any conversation on mobility and mobile testing should always be a company’s digital strategy. Unfortunately, most companies I speak with today don’t have a fully formed digital strategy, and those that do often have one that is half-baked or based on the perception that an app is part and parcel of any attempt to enter the mobile space. The truth is that a well-built, responsive and intuitive mobile website is almost as important – if not more so – and can also perform most if not all the functions of an app, depending on the type of business it’s used for.
Deciding between apps, websites, or a combination of the two is just one of the challenges. A comprehensive digital strategy also needs to cover factors like device management, device types, usability testing and automation.
From a testing perspective, device management is critical because mobile devices are susceptible to damage, loss and theft more frequently than almost any other device type. It may seem inconsequential, but devices are expensive, and the risk of valuable IP taking a walk at a critical development stage is very real.
The issue of device types is important regardless of the software you’re testing, be it an app, a mobile site or a desktop site on mobile devices. Even a closed platform like iOS comes with the challenge of users with previous generations of iPhone and iPad, and multiple iterations of previous generations as well.
A modern, responsive mobi site or app might light up the screen of the latest iPhone, but could bring legacy iPhones running older versions of iOS to a standstill. And iOS is fairly straightforward compared to the permutations of the hundreds or thousands of Android devices from dozens of different manufacturers on the market today.
Once device management and types have been narrowed down to a manageable grouping, usability testing on these devices is a third major consideration. This is where mobility testing also differs the most from other forms of software testing, because it’s necessarily hands-on. Yes, developers can test their mobile apps or websites on simulators and the odd device, but neither of these options is anywhere close to sufficient for a proper functional test of new software (or a new version of existing software).
It’s almost impossible to remotely test mobile software, not because the technology is lacking, but because usability is such a big factor in the success or otherwise of a mobile app or website. And when it comes to usability, that means testing by experienced human operators, not machines.
Which brings me to the last point, automation. Even if it were practical to automate some parts of the mobile testing process, the rapid rate of change in both devices and apps (and websites) means by the time you’ve invested in solid testing scripts for your software, a new version rolls out, your users have upgraded to new devices, and a new OS has been released. That’s not to say automation won’t play an important role in your mobility strategy, but you’ll probably find manual testing plays a much bigger one.
By now you’re probably getting the sense that jumping into the mobile space – or growing your current mobile presence – is a much bigger ask than you thought, and you’d be right. Navigating the mobile testing minefield can be a nightmare if you don’t have a solid, thought-out digital strategy that informs every decision you make based on the value of the investment to the business.
A good place to start would be finding a likeminded partner with the experience to guide you through the creation or refinement of your digital strategy, before giving you access to the resources you’ll need to make it the success you need it to be.
This article was published exclusively for ITWeb on 13 September 2016.
Designated as the ‘automation specialist of choice’ by Old Mutual S.A., veteran software testing group DVT is setting its sights on the UK, eager to help new clients in a post‑Brexit world.
Now that 2017 is fully underway, Editor of TEST Magazine Cecilia Rehn caught up with Chris Wilkins, CEO of Dynamic Technology Holdings, and Bruce Zaayman, Director: DVT United Kingdom, to discuss how this South African powerhouse is poised to help UK businesses optimise automation this year.

DVT is well known as one of the largest, privately‑owned software testing groups in the southern hemisphere, but can you give us an introduction for our European audience?
Chris Wilkins: DVT started in Cape Town in 1999 and we have built up our group to a staff of 600 professional software developers, testers, business analysts, project managers and architects.
At heart we are a software development company, and over the last 10 years we’ve recognised that software testing is becoming more and more important, so we built up a very large and very competent testing team. This is made up of 200 to 250 testing professionals and includes our Global Test Centre facility in Cape Town, one of the largest specialised testing facilities in the southern hemisphere.
Our clientele spans from large finance and insurance firms and media companies, down to smaller organisations such as Doddle.
Our focus in testing is automation; we believe that the world will slowly move towards automation, and that the way to go is to outsource and commoditise software testing – making life easier and allowing enterprises to focus on the more specialised, and possibly more interesting, QA jobs.
What are the main services that DVT provide?
Bruce Zaayman: We provide agile software development, testing, consulting and training.
We have also built our own test automation framework, which means our clients don't have to pay any license fees for their testing projects. The main reason we developed the Java-flavoured UTA‑H (Unified Test Automation – Hybrid) framework is that a lot of companies don’t want to spend the money on the big players – you know, the HP or CA type tools. For that reason, it is based on Selenium WebDriver, saving costs as we’re not tied to a license for one individual machine.
If we need to run through a massive amount of work in a short amount of time we spin up a couple of VMs and we can run on double, triple, the amount of machines in order to reduce the time. So that’s a major selling point and I think that that’s something our clients look for.
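The scale-out Zaayman describes – running the same suite on double or triple the machines to cut elapsed time – can be illustrated with a small Python sketch. This is not DVT’s UTA‑H framework: the test case below is a placeholder (in a real suite each case would drive a Selenium WebDriver session against the application under test), and a local thread pool stands in for the fleet of VMs.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Placeholder test case: in a real suite this would launch a Selenium
# WebDriver session and walk through one scripted user journey.
def run_test_case(case_id):
    time.sleep(0.1)  # simulate browser interaction time
    return (case_id, "pass")

cases = list(range(20))

# Serial run: elapsed time grows linearly with the number of cases.
start = time.perf_counter()
serial = [run_test_case(c) for c in cases]
serial_time = time.perf_counter() - start

# Parallel run: spreading the cases across 10 workers (think 10 VMs in
# a grid) cuts the wall-clock time by roughly the worker count.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=10) as pool:
    parallel = list(pool.map(run_test_case, cases))
parallel_time = time.perf_counter() - start

print(f"serial: {serial_time:.2f}s, parallel: {parallel_time:.2f}s")
```

In a real grid the workers would be remote WebDriver endpoints rather than threads, but the arithmetic is the same: adding machines shrinks the elapsed time, with no per-machine licence cost in the way.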
CW: Everybody wants flexibility; everybody wants scalability. We, as a company, are pragmatic delivery specialists; we’re not trying to play in that big generic space. We’re not looking for these massive deals; we’re just saying ‘we can get the job done for you.’ The framework’s been built with that in mind, to get the job done, and it’s 80/20. Once the process is more or less 80% complete, the learning curve has been so dramatic that it makes that last, more challenging 20%, that much quicker and easier.
BZ: We also use other tools and automation frameworks. We are agnostic: if a client has a tool, then we are more than happy to augment that team with our service offerings and our skills. To us, an automation specialist is not just a functional software tester with some tech background; we use Java-development-type resources. We run test automation from a development point of view, and find that this flexible and scalable approach works very well.
What can a South African venture offer to the UK/European market?
CW: Post‑Brexit, we think Britain is looking at being more of a global citizen again, and we believe South Africa is a culturally and economically sound partner.
In terms of IT outsourcing, we believe the Indian model, although effective for some companies, is neither specialised nor boutique enough for most. And when you consider the euro’s recent increase, other Eastern European options have become more costly. In contrast, the South African rand is extremely competitive, which means there is a strong case that partnering with a Cape Town‑based firm can be a strategic cost-reduction and cost-mitigation exercise as well.
However, we consider our strengths to be based on more than economics. When it comes to cultural familiarity, there’s a strong link between Britain and South Africa. We’re part of the same Commonwealth, share a common language and are in the same time zone, so you can pick up the phone and talk to someone straight away. A lot of Brits travel to and from South Africa, and a lot of them have families there as well. So there’s a strong sense of it being part of the British framework.
And of course there are loads of South Africans working in London and in the UK. These cultural links are so important for IT outsourcing in particular, when miscommunication could have huge ramifications for a project. South Africans' first language is English, they are educated in a system that reflects the British educational system and our best practice, the way we do things, the way we work, the methodologies, the jargon, they are all exactly the same.
On the whole I think communication is as easy as it can get. We are a much easier country to work with than any of the other primary sources of offshore work at the moment in Eastern Europe and India.
A key part of DVT’s business is your Global Testing Centre. How does this support your offerings and clients?
CW: The Global Testing Centre is a natural extension of our testing service. It’s all about having your testing carried out remotely, so you don’t have to hold onto the headache of staff, you don’t have to manage your peaks and troughs as large projects come and go in quick succession. Our clients don’t have to worry about finding very specialised skills for 10 hours a month; we’ll find them internally.
So the logistical benefits are enormous, it just takes away the nuisance.
We will also make sure that the bridge between the clients and the test centre is built and that it is maintained, and that there is just the right flow of communication that goes on between them. Every client is on a different maturity curve when it comes to software testing, and we provide a tailored, bespoke service.
Because DVT’s focus and expertise has been on automation, we can consult and advise on how to tackle the more emotional aspects of automation with your staff, how to take them down that road, how to get them onto that first rung of the ladder, and then how to continually invest so that over time your automation gets faster and faster.
We ensure clients can get product to market faster and, most importantly, that expensive software developers aren’t left waiting around for testing to finish.
Our global test centre can facilitate all of that.
BZ: The GTC is structured into pods of 30 to 50 people, run by senior technical managers. This structure ensures that there’s always senior technical knowledge onsite, in close contact, and that all resources allocated to clients have senior oversight. South Africans, in general, are very positive about working with international clients and forging global business links. So we ensure we have talented staff onsite with the technical knowhow to support clients, and the enthusiasm to go above and beyond.
CW: Enterprise firms like the GTC because we have the size and the scale of a larger organisation structure and start‑ups like us because we have agility in that centre and we can move around quickly. Also, the really good news is that we always have 10 to 15 people available at short notice. We would encourage any new client to work with us on an initial proof of concept, which we can often turn around in a couple of days or weeks. This would be an investment by DVT into a client, to demonstrate the way we work, the kind of experience they might get if they signed us up as a more strategic partner.
You’ve recently partnered with British TSG, what does this partnership look like?
CW: We were initially introduced through mutual acquaintances 18 months ago. This partnership makes sense: TSG went through an MBO last year, so with new ownership and invigorated management, they are tackling the market with fresh eyes. They’re British-owned and British-managed, with blue-chip clients.
As specialists in the UK market and with high‑end consultative skills, TSG really complements our proposition as an outsourcing destination. Partnering with TSG allows us close proximity with the client and senior boots on the ground, whilst we give TSG scale, flexibility, and dynamism, all in the same language and same time zone.
Working closely together from TSG’s City offices, we serve as the preferred offshore partner. I think every British software vendor or testing specialist needs this flexibility for their clients. To stay competitive, it’s an absolute necessity.
TSG is our partner of choice. We don’t want to have to have a shotgun approach to partnerships. We’d rather have just one very good partner, and of course, we want to accelerate this business together now and win new UK clients.
What are your thoughts on trends in outsourcing for 2017 and beyond?
CW: It is clear that organisations will need to invest in various different avenues to tackle testing challenges, including cost‑effective outsourced partners and a serious focus on automation.
What it boils down to is that we need fewer and fewer people to do more and more testing work. With the legacy that surrounds an enterprise today, there’s an enormous amount of software – lines upon lines of code. I think automation is critical; otherwise costs, time and effort will balloon out of proportion, and you will not be able to keep up with more agile competition.
We’re offering a specialised solution; we’re not trying to do mass‑produced stuff. South African outsourced staff have opinions, they’re not just going to sit and do as they’re told and say ‘yes’. They will question and talk. So I think we could be a very refreshing option for people wanting to outsource.
We’re a good company to work with if you want to outsource gradually – starting with a manual-oriented approach, then transitioning across into a more automated environment. So we tick the boxes on both sides, and over the 10 years of developing our QA competency – being software development specialists – we’ve introduced all the learnings, techniques and best practice. Not best practice in just a global, generic sense, but best practice in what we feel is the right way to test software.
Another key concern for organisations is how to cope when you need a specialised skill, opinion or consultation for just a few hours a week or a month. If you’re not outsourcing, you’ve got to go and find that skill somewhere if you don’t have it in-house. We’ve got a big test team, and an extended team through the company as well, so we’re more than likely to find it internally. Increasingly, organisations are finding that this can be a big advantage: immediate access to specialised knowledge and insight can clear log jams very quickly.
You’ve been in the industry for a long time, how do you think testing and QA is changing?
CW: I don’t think it’s changing fast enough. I think the extraordinarily high volume of software code out there means that regression testing is, or should be, one of the primary focuses for the enterprise – not just for quality, but also for speeding up the entire delivery lifecycle.
Automation testing products are reaching a better level of maturity. We’re seeing for the first time in the last few years that these products really can do the job, which means that testing automation will start coming into its own in the next five years.
So we believe in automation and offshoring, but with a more boutique flavour – not the mass-production, ‘throw 20 more people at the project’ ideology that’s been adopted by other jurisdictions. This is a tired tactic, and we need a sharper, more adaptable approach now. And of course Brexit is going to introduce its own peak of regression testing, where small code changes are going to have to be made to accommodate compliance with whatever Brexit regulations are agreed upon.
So, where is it going? There’s more formality around it. I think everyone agrees that getting an expensive Java developer to test is crazy; you actually need a separate team with a separate responsibility, staffed by people who are trained to test, not to code. There’s always going to be some tension in the collaboration between those two teams. Software developers also write their own code, so they’re more inclined to test it and say it’s okay quite quickly.
And it’s not only the code; it’s the UX as well, which is also becoming more important as the end users’ expectations change.
We’re looking forward to showing the UK what we’ve got, and helping this market navigate post‑Brexit uncertainty with a strong, neighbourly partner!
For more information about DVT please visit: www.dvt.co.za
Source: This article was published in the March 2017 edition of TEST Magazine.
By Jacqueline Metrowich
People often ask which Scrum Master certification they should do. There are two internationally recognised certifications, and both of them are endorsed by the founders of Scrum and authors of the Scrum Guide, Ken Schwaber and Jeff Sutherland.
There is no right or wrong when choosing between Certified ScrumMaster (CSM) and Professional Scrum Master (PSM). Both certifications are equally good to have on your CV as evidence of your understanding of Scrum, and you will have to weigh up the pros and cons of each and decide which is most suitable for you. For example, in a two-day CSM class you get practical experience, networking opportunities with other Agilists and exposure to an international subject matter expert, whereas the PSM course is a quick, although not necessarily easy, and cost-effective way of getting the desired certification.
Here is some information to help you decide:
1. Certified ScrumMaster (CSM) – Scrum Alliance
To be certified you first must take a two-day course taught by a Scrum Alliance authorised trainer. The process to become a Scrum Alliance authorised trainer is lengthy and stringent to ensure a high standard of training and an accurate representation of Scrum and Agile practices and principles. Instructors must verify their knowledge, experience and training ability, and their course content has to be approved as being consistent with Scrum and Agile principles. For this reason, there are not many trainers certified to train CSM, and the course is relatively expensive, starting at R12,500 (excl. VAT) in South Africa. However, you can be assured that a course from a Scrum Alliance authorised trainer is the real thing.
Within 90 days of doing the course, attendees must do an online assessment, which is included in the cost of the course. The assessment consists of 35 multiple choice and true or false questions, of which 24 must be answered correctly. It takes about an hour to do and can be completed in more than one sitting. It can also be retaken once at no extra cost. There is a small fee of $25 for subsequent attempts or if done after 90 days of doing the course.
The CSM certification includes a 2-year membership with Scrum Alliance. Both the certification and Scrum Alliance membership require renewal every two years for a $100 fee.
Website - https://www.scrumalliance.org
2. Professional Scrum Master (PSM) – Scrum.org
For the Professional Scrum Master certification, it is not mandatory to take a training course if you feel you already have a high level of knowledge about Scrum from self-study and on the job experience. You can then do the online assessments to certify your knowledge and ability to apply it.
There are three levels of PSM assessments (foundation, intermediate and advanced) that are based on the Scrum Guide, which is freely available, as well as other recommended reading suggested on the Scrum.org website.
Scrum.org provides training as well. However, there are no courses available in South Africa.
The PSM I assessment consists of 80 questions – multiple choice (with one or more correct answers) and true or false – and requires an 85% pass mark. It takes about an hour and, although it is open book, it requires an in-depth understanding of the Scrum Guide and the ability to apply its content based on your experience. It is said to be harder to pass than the CSM assessment, and each payment only gives you one chance to take it, after which you have to pay again. There are free open assessments available to test your knowledge beforehand and gauge your chances of passing.
The costs of the assessments currently are:
PSM I: $150
PSM II: $250
PSM III: $500
There is no renewal cost for the PSM certifications.
Whereas you can claim PMI PDUs (Project Management Institute Professional Development Units) for CSM training, you cannot claim any for the PSM assessments.
Both Scrum Alliance and Scrum.org also provide other certifications to further your Scrum knowledge and expertise. (e.g. Certifications for Product Owners, developers and coaches.)
You may have heard of alternative Agile certifications and training and wonder what they offer and how they differ. While CSM and PSM are specific to Scrum, these options cover Agile more generically and tend to be more expensive. The primary ones that are available in South Africa are:
- PMI-ACP: Offered by the Project Management Institute, requires training, experience and passing an exam
- AgilePM: Offered by APMG International, needs training or self-study to pass an exam
- PRINCE2 Agile: Requires training and passing an exam
DVT Academy offers the Scrum Alliance Certified Scrum Master training and certification with international trainers. DVT Academy specialises in Agile training and certification, and also provides the Scrum Alliance Certified Product Owner training, a suite of ICAgile Certified courses, Kanban training as well as locally certified Agile courses. For more information on all of the courses, visit: http://www.dvt.co.za/training
IT interns are the future builders of our industry, which makes it particularly depressing that so many internships fail. That said, I have an added advantage – I work for a company that runs regular education programmes (internships and learnerships) for university students and high school learners – and this gives me insight into both the failure and the success of these internships.
To better understand the hows and whys, we first need to understand the five building blocks of every internship, each of which can either work for or against a company and its interns.
1. The intern
Before we even get into a discussion about failed internships, consider the intern. I’ve seen so many companies take learners out of school straight into an internship, which sets both company and learner up to fail. That’s because a learner is not an intern.
An intern, by definition, is someone already qualified in his or her profession, who requires focused training to bridge the gap between academic qualification and practical experience.
A learner, on the other hand, is someone fresh out of school who needs training just to get to the level of a professional-in-training (that’s right, an intern).
Assuming you have a bona fide intern at your door, the next step is to pick the right intern for the job. As you would do with any other recruitment process, the intern needs to be someone who not only has the base knowledge and training but someone who is also suited to the requirements of the job at hand.
A free-spirited, anti-establishment maverick is not going to be comfortable spending most of his day behind a screen if that’s what the role calls for, just as much as a thoughtful introvert will not be comfortable meeting with high-level clients on her first day at work.
Get the recruitment right, and your internship is off to a good start.
2. Bridging the gap
So you have a varsity-qualified, bright-eyed, bushy-tailed, enthusiastic intern in your midst. Now what? He or she may have all the necessary certificates and ticked all the right boxes, but have no idea how to host a client briefing or work in a live development environment.
As professionals, we need to give interns a foundation in their new roles, and then quickly get them up to speed with what we expect them to do as working professionals in their own right. Too many companies complicate this step by overtraining their interns. A three-month induction is about right; six months is too long. Remember you’re not dealing with learners who need to be taught how to code or test code; you’re working with interns who should one day soon be able to work for you.
3. The mentor
The bulk of an intern’s internship should be on-the-job experience and real-world projects. That’s where the mentor comes in.
It’s fair to say the mentor is the most important part of any internship. An internship will not succeed if you don’t have that crucial top layer in your company actively invested in developing your interns.
The mentor is the catalyst that turns varsity graduates with talent and potential into focused, effective professionals. Mentors are vital resources for their interns, particularly in the first three to six months where raw knowledge needs to be converted into practical experience.
You could have the most enthusiastic intern who, without a mentor invested in his success, loses his way and drops out of the programme. Dedicated mentors are even harder to find than good interns because, for them, their interns are not a side project or part-time role. It is a full-time, all-consuming job to mould interns into colleagues, and to inspire them to pursue their career of choice by sharing their knowledge and passion for what they do.
4. Get real
I said it above and I’ll say it again: an intern is not there to learn how to be an intern, he’s there to learn how to be a professional. The best way to do that is by doing what professionals do – for real.
The mentor’s role is to get his interns to the point where they’re proficient enough to work unsupervised on a live project. After that, it’s up to the intern – and his team and team leaders – to go the rest of the way by showing the aptitude, attitude and application to succeed.
A good internship programme will not wrap its interns in cotton wool; if interns are expected to solve real problems for real clients, they need to be working with real problems for real clients. They won’t be fending for themselves – at least not at first – which is where a carefully structured programme eases interns into their roles.
As with any other role in any other company, interns should be picked for the roles that suit them best. They need to be exposed to every aspect of their roles, from project deadlines to irate clients, Scrum Masters, delivery managers, business managers, and the head of marketing.
You’ll know they’re ready when they start to have a real impact on the outcome of a project.
5. Show me the money
There’s a fine line between running a successful internship programme and running a production line of cheap labour. One is meaningful and constructive, the other is self-serving and destructive, for both company and intern.
Any company can hire a qualified graduate and call them an intern, but the real value of an intern is the value they add to the business while growing and maturing as a professional. An internship should never be about cost cutting or profiteering, although that’s what too many companies try to do.
As with any other industry, you’ll get superstars and you’ll get strugglers, and both will need different levels of care and attention within the confines of a structured internship programme. In South Africa, we don’t have laws and regulations governing the movement of IT interns as there are for, say, medical or law interns. That makes it all too attractive for companies to poach other companies’ brightest interns with the lure of more money, more responsibilities, or both.
An early exit from an internship not only damages the prospects for an intern in the long term, but it also nullifies an often-substantial investment the original company already made in them.
Even if you’ve navigated the first four hurdles of a successful internship, it’s usually this last, crucial step where so many interns – and internships – ultimately fail.
To learn more about DVT’s internship opportunities, go to: www.dvt.co.za/careers/vacancies
By Mario Matthee
For all the benefits of performance testing your software, most (rigorous) performance testing is not done on your actual production systems. This somewhat diminishes the benefits; it’s as if you were running ‘what-if’ scenarios on a simulator rather than on the real thing.
While limited performance testing can safely be attempted on production systems (and sometimes is), there are good reasons for not running extreme performance testing (stress tests) on your production systems. Where basic performance testing might put a small load on your systems, stress testing is just that – stressful – on you and your systems. It necessarily pushes your systems to the limit, hitting them from different angles and pushing obscene amounts of traffic through narrow bandwidth pipes in a concerted effort to break them.
For most organisations, it is therefore impractical at best – and negligent at worst – to run anything even resembling a stress test on a live system. Except you can, and you should.
Don’t get me wrong; I’m not advocating the risky scenarios you’re probably imagining. In fact, while most commercial performance testing tools could conceivably run on production systems, they generally can’t work through the highly encrypted and secure tunnels that organisations like banks use to protect their systems from any and all forms of malicious intrusion.
But there is a way to performance test your production systems. Not only that, there is a way to automate the performance tests on your production systems, so that as soon as performance drops below a certain level, you will know about it. I call it Automated Functional Performance Testing, which very cleverly overcomes the limitations of traditional performance testing using very mainstream automated test scripts not necessarily designed to test performance.
Let me explain, using the aforementioned banking system as an example. As a bank, you use expensive proprietary software and secure protocols to guarantee your customers’ safety and verify the integrity of your transactions. The same software prevents performance testing tools from seeing – and therefore diagnosing – any problems that may occur inside the secure code.
In other words, if something is broken while a secure transaction is taking place, and that something is slowing down your system, a performance testing tool is not going to help you troubleshoot the problem because it can’t see it, so to speak.
However, install an automated script that keeps tabs on the time it takes the system to process a transaction request – the time between the customer issuing the request (before it goes through the security tunnel) and getting a result (on the other side of the tunnel) – and suddenly you have eyes on a critical part of your system’s performance without compromising security or adding any overhead to the process.
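The timing idea above can be sketched in a few lines of Python. This is a minimal illustration, not DVT’s actual tooling: `submit_transaction` is a hypothetical placeholder standing in for the real client-side request, and the script only measures the wall-clock time between issuing the request and seeing a result, never looking inside the secure tunnel.

```python
import time

def timed_call(action):
    """Run `action` and return (result, elapsed seconds)."""
    start = time.perf_counter()
    result = action()
    return result, time.perf_counter() - start

def submit_transaction():
    # Placeholder for the real client-side request; in practice this
    # would drive the UI or API exactly as a customer would.
    time.sleep(0.05)  # simulate the round trip through the tunnel
    return "OK"

result, elapsed = timed_call(submit_transaction)
print(f"transaction returned {result!r} in {elapsed:.3f}s")
```

Because the measurement wraps the whole request from the outside, it adds no overhead inside the secure code path.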
Not only that, but the automated script you would use to ‘test’ the performance of this aspect of your system will also cost significantly less than a typical performance testing tool which, as you probably know, is not cheap.
The same technique can be used in different ways on different systems to the same effect. For example, run an automated script on your web server to test the response times on your newsletter signup form, or run a test script to let you know when a page click takes longer to return a result than your SLA demands.
Performance testers establish SLAs to make sure that apps reach certain benchmarks of performance. For example, a typical SLA might include a clause that requires the first page of search results to be returned within three seconds when performing a product search.
You can even extend the script to send alerts to any number of people responsible for maintaining your business systems, giving you an early warning system to proactively rectify any faults before they start impacting your customers’ experience.
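Putting the last two ideas together, a script like the following sketch checks each measured response time against an SLA threshold and fires an alert when the budget is exceeded. The three-second limit echoes the search-results example above; the alert hook is illustrative, as a real script might notify people by email, chat, or pager instead of printing.

```python
SLA_SECONDS = 3.0  # e.g. first page of search results within 3 seconds

def check_sla(elapsed, sla=SLA_SECONDS, notify=print):
    """Return True if `elapsed` (seconds) is within the SLA.

    When the SLA is breached, send a message through `notify` so the
    people responsible for the system get an early warning.
    """
    if elapsed <= sla:
        return True
    notify(f"SLA breach: response took {elapsed:.2f}s (limit {sla:.1f}s)")
    return False

# Example: a 4.5-second product search breaches the 3-second SLA
check_sla(4.5)
```

Passing `notify` as a parameter keeps the measurement logic separate from the alerting channel, so the same check can feed any number of recipients.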
If all of this sounds very much like the results you expect to get from performance testing, that’s because it is. True, you’re not using testing tools that have been specifically developed to isolate and remedy serious performance issues, but you’re going one step further by working with production rather than development systems.
That said, I’m not advocating that one is necessarily better than the other. Dedicated performance testing and performance testing tools play a vital role in generating quality software from the very start of the development process right through its lifecycle. Performance testing tools are also increasingly becoming mainstream, and therefore becoming more cost effective to run.
This approach is almost like production monitoring from an infrastructure perspective, but instead of focusing on detailed memory and CPU usage, it focuses on the customer experience from a speed/response perspective. It’s also handy for monitoring the uptime and response times of third-party systems the application integrates with. In most cases, if these third-party systems are not available, the business transaction cannot be processed.
Automated Functional Performance Testing is not performance testing in the traditional sense, but gets you many of the advantages of performance testing with the additional advantages of live information from your live systems at a lower cost.

Group CEO of Dynamic Technologies
- post-Brexit UK Tech's new offshore partner!
In spite of the many and obvious wrongs in political and social South Africa, there are many reasons to rejoice at the big hearts and generous spirits around us.
The team at DVT SA has built a company that, considering the gross imbalances in our society, reflects an amazing diversity of culture, gender and race. It’s not good enough to think about it, nor is it appropriate to simply ignore it; in South Africa, we have to act on it and say it like it is. Moreover, it’s down to individuals and private business to make the changes that make a real difference to those who need it the most.
The action is all about providing jobs for many disadvantaged people who would normally struggle to break into the IT industry. After their first year of productive work, graduates and school leavers have at least a 50% better chance of getting their next job.
DVT has found a creative way to employ inexperienced graduates and school leavers into the technology sector. We bridge the race and gender spectrum, focusing on disadvantaged backgrounds, and bring out the best in those we employ.
As the founder of DVT who understands the enormous challenges of bringing inexperienced youngsters into the high-flying technology industry, I am extremely proud and in total awe of what the team at DVT has achieved over the past five years.
In 2016 and the first part of 2017, we introduced more than 100 new young professionals to the IT industry. This year is set to exceed that. We don’t just touch these young lives; we touch all of those around them.
I don't often blow our trumpet so expansively, but this is a wonderful achievement within a true African context and right where it is needed.
I challenge other Tech companies in SA to do the same.
Well done DVT.
Jargon is like lava that bubbles into the air, pushed by enormous magma movements beneath the Earth, finally breaking out volcanically and painting the terrain with its fiery, colourful brush.
Yep, jargon is just as dramatic and colourful in its choice and use of words.
The problem with jargon is that, like magma, the underlying forces are rarely well understood. Enormous stresses and strains finally push the lava onto the surface. However, the circumstances affecting the magma beneath are largely unknown and open to subjective interpretation. That's what jargon does. It gets pushed and formed from complex and volatile environments that brew for years. Change in these environments is not served best by blasé and one-line descriptions looking for one-size-fits-all answers.
When I started as a developer in the software industry, I was intimidated by those fast-talking consultants who made jargon sound important, impressive and an invaluable tool in your box of commercial tricks.
But it's not. Jargon is at best a guide, and at worst utterly confusing.
'Any intelligent fool can make something bigger, more complex, and more violent. It takes a touch of genius - and a lot of courage - to move in the other direction.' (Attributed to E.F. Schumacher, and often to Albert Einstein.)
Let's use jargon sparingly and never assume that the person we are talking to shares our interpretation, or even remotely knows what we are talking about. Jargon simply serves as an introduction to a unique and special problem in each business environment. It needs to be used carefully and is most effective when couched in a question like: 'What does DevOps mean to you in your organisation?'
