Updated on July 3, 2017 by Mike Baukes
With Chef or Puppet automating your system architecture, you can provision the environment piece by piece and spin up applications in a heartbeat. This is configuration management at its best: a time-saving mechanism that makes environments highly repeatable.
Ingenious as these platforms are, they are not infallible: the code written for them carries its own inherent defects. On a small system, manual testing is feasible, but as capacity grows that approach wears thin, and each subsequent release increases the pressure to test only what actually needs testing.
Automating deployment does not imply automated testing. Errors in the automation code are carried straight into the system, with much the same consequences as any other bug, and they will be replicated over and over, every time the code runs.
Automation and testing are not interchangeable terms, so automated configurations still need to be tested. There is some respite here: much of the groundwork for automated testing will already have been done in earlier stages. For instance, the map of requirements and the highly detailed specifications written for automated deployment will be available, and they should go a long way toward easing the testing process.
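At its core, a configuration test compares a desired state against what is actually on the machine. The sketch below shows this principle with a hypothetical desired-state map; real tools (Serverspec, Testinfra, and the like) work on the same idea but with richer collectors.

```python
# Minimal sketch of an automated configuration check. The keys and
# values here are hypothetical; a real check would read actual system
# state (services, files, kernel parameters) instead of a dict literal.

def find_config_drift(desired: dict, actual: dict) -> dict:
    """Return every key whose actual value deviates from the desired state."""
    drift = {}
    for key, want in desired.items():
        have = actual.get(key)
        if have != want:
            drift[key] = {"expected": want, "found": have}
    return drift

desired = {"nginx.worker_processes": 4, "ssh.PermitRootLogin": "no"}
actual  = {"nginx.worker_processes": 2, "ssh.PermitRootLogin": "no"}

print(find_config_drift(desired, actual))
# {'nginx.worker_processes': {'expected': 4, 'found': 2}}
```

Because the desired-state map already exists as part of the deployment requirements, a check like this costs little to write and can run after every change.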
Going back to basics, if you sketch a maturity model for the migration from a fundamentally manual infrastructure to a fully automated one, three distinct levels emerge:
a) Manual deployment and testing: every process is manual, with perhaps a little monitoring but no configuration analysis.
b) Manual deployment, automated testing: the system is still deployed by hand, but configuration testing is automated, meaning information is collected before and after every change and continuously checked and verified. Deployment still carries the cost of manual work, but automated testing and verification build confidence that a relatively error-free system will be rolled out.
c) Both automated: the ultimate level, which opens up the priceless gains of automation. Test data drives an organized, automated deployment, and automated testing keeps the configuration as error-free as possible, since errors introduced by interruptions are caught and corrected.
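The before-and-after verification described in level (b) can be sketched simply: snapshot the configuration, apply the change, and confirm that only the intended keys moved. The snapshot contents and keys below are hypothetical.

```python
# Sketch of level (b) verification: compare two configuration snapshots
# against the change that was intended, and flag any unintended drift.
# (Illustrative only; real snapshots would come from the live system.)

def verify_change(before: dict, after: dict, intended: dict) -> list:
    """Return keys that differ from what the intended change allows."""
    problems = []
    for key in sorted(set(before) | set(after)):
        # A key may change only to its intended value; otherwise it
        # must keep the value it had before the change.
        if after.get(key) != intended.get(key, before.get(key)):
            problems.append(key)
    return problems

before   = {"max_connections": 100, "log_level": "info"}
after    = {"max_connections": 500, "log_level": "debug"}  # log_level drifted
intended = {"max_connections": 500}

print(verify_change(before, after, intended))  # ['log_level']
```

Run continuously, a check like this is what turns manual deployment at level (b) into trustworthy deployment: any interruption or side effect shows up as an unexplained difference between snapshots.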
The Road Less Traveled
The vast majority of developers stop at the second level. The desire to enjoy the benefits of automated rollout has led to a dangerous tendency to overlook the far stronger position gained by automating both processes.
The advantages of test-driven development are not lost on any self-respecting developer. Establishing a benchmark for performance before building the functionality produces far superior systems, for two reasons:
a) It keeps attention fixed on the desired outcome: the methodology emphasizes which functions are required and how well they must perform.
b) A test functions in perpetuity, and is the ideal stop-loss against bugs and defects introduced by subsequent tweaks and overhauls. Good tests have a reach extending well past the initial rollout, staying useful through every later change and modification.
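In an infrastructure context, the test-first idea looks something like the sketch below: the expected outcome is declared before the automation that produces it exists, and the same check keeps guarding the system afterwards. The service names are hypothetical, and this is a principle sketch rather than any specific tool's API.

```python
# Test-first for infrastructure: declare the expected end state up
# front. The test fails before the deployment code is written, and it
# keeps running (and catching regressions) long after initial rollout.

EXPECTED_SERVICES = {"nginx": "running", "postgresql": "running"}

def failing_services(actual_states: dict) -> list:
    """Return the services whose observed state misses the expectation."""
    return [name for name, want in EXPECTED_SERVICES.items()
            if actual_states.get(name) != want]

print(failing_services({"nginx": "running", "postgresql": "stopped"}))
# ['postgresql']
```

The point of the sketch is the ordering: the expectation is the specification, so the same artifact that drove development continues to police every subsequent modification.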
Proponents of Infrastructure as Code stand to miss out on these benefits if they skip this vital step. Focus on the desired configuration state will more often than not erode, leaving the system exposed to future errors with no safeguards whatsoever. Drifting from a configuration-oriented bearing leaves the system susceptible to errors that damage its operation.
The Common Developer's Perspective
Tests may be terribly tedious, but the value they bring cannot be ignored. Their impact on a system's fortitude is unequaled, because they are essentially the only way to measure that quality in a controlled environment. To reiterate, automation and quality do not go hand in hand by default. The only sure way to approach flawlessness is a battery of tests, a process best carried out through automation.
Limited access to the tooling is partly to blame for testing's lower appeal. Moreover, every system's unique departures from a conventional configuration demand unique tests, so tests rarely transfer between systems. That poses a real challenge for developers seeking to fortify their work, and it comes on top of the acknowledged tedium of writing test scripts.
Drawbacks to Automation
The benefits of fully automated testing are indeed substantial, but there is a flaw: the slightest deviation in the automation script will, without doubt, be carried through the entire process. An inexperienced developer may end up with a system riddled with defects in ways beyond their comprehension, unable to spot the malfunction in the testing process itself. A skilled developer with manual testing experience, however, can apply that knowledge to pick out trouble spots in the automation script and deliver a sound, functional deployment. Automation in isolation is therefore not ideal; the right combination of manual and automated effort draws the best from both worlds.
Automation is a darling concept, employed across many fronts with widespread success, and the consistency and efficiency it achieves have been enjoyed in every sector. Implementing it carefully therefore helps ensure that the products created with it surpass established benchmarks of quality.
Misconfigurations are an internal problem, emanating from within any enterprise's IT infrastructure; no hacker is necessary for massive damage to occur to digital systems and stored data. And the problem is pervasive: Gartner estimates that anywhere from 70% to 99% of data breaches result not from concerted external attacks but from internal misconfiguration of the affected IT systems.