Using the attached articles on information systems best practices, discuss some of the key issues to be aware of and the best practices to mitigate them.
§ Discuss strategic analysis decisions in the next five years, what we need to watch out for in the information technology (IT) field, and how these decisions will impact the overall company.
§ Examine potential changes in IT related to innovation and organizational processes.
§ List and describe internal (online) information security risks and mitigation tactics and how they will affect decision-making strategies.
§ List and describe external (building) information security risks and mitigation tactics and how they will affect decision-making strategies.
Your scholarly activity submission must be at least three pages in length. You are required to use at least one outside source to support your explanation. All sources used must be referenced; paraphrased and quoted material must have accompanying citations and be cited per APA guidelines.
Your scholarly activity should be formatted in accordance with APA style.
Please use the attached sources and one outside source.
A sample paper is also attached.
Reproduced with permission of the copyright owner. Further reproduction prohibited without permission.

DEVELOPMENTAL TEST AND REQUIREMENTS: Best Practices of Successful Information Systems Using Agile Methods
Jeremy D. Kramer and Lt Col Torrey J. Wagner, USAF
Defense ARJ, April 2019, Vol. 26 No. 2: 128-150, https://www.dau.mil
Defense Acquisition University Alumni Association Hirsch Research Competition 2019, 2nd Place
DOI: https://doi.org/10.22594/dau.19-819.26.02
Keywords: Development and Operations (DevOps), DoD, Software Development, Automated Testing, Software Development Life Cycle (SDLC)

This article provides insights into the current state of developmental testing (DT) and requirements management in Department of Defense information systems employing Agile development. The authors describe the study methodology and provide an overview of Agile development and testing. Insights are described for requirements, detailed planning, test execution, and reporting. This work articulates best practices related to DT and requirements management strategies for programs employing modernized Software Development Life Cycle practices.

Industry software development efforts have used Agile and development and operations (DevOps) methodologies over the last 5 to 15 years. The Department of Defense (DoD) has applied these methodologies to various information system acquisition programs, and current guidance provides a renewed interest in pursuing these methodologies.
The National Defense Authorization Act for Fiscal Year 2018 (NDAA, 2017) directs acquisition Program Management Offices (PMO) to pursue Agile or iterative software development by establishing pilot programs to use "Agile or Iterative Development methods to tailor major software-intensive warfighting systems and defense business systems" (§ 873) and a "Software Development Pilot Program Using Agile Best Practices" (§ 874). Similarly, the 2018 Air Force Guidance Memorandum for Rapid Acquisition Activities (AFGM 2018-63-146-01) states that "Agile software development and [DevOps] is required for all new initiatives unless waived" (Department of the Air Force, 2018, p. 9).

Background

This study was motivated by the 2018 NDAA and the increased emphasis on Agile development throughout the DoD; the lead author collected data from April to September 2018 during a rotational assignment within the Office of the Deputy Assistant Secretary of Defense for Developmental Test and Evaluation, DASD(DT&E). The purpose of the study was to collect insights and best practices from DoD program offices practicing Agile software development methods, some of which use the DevOps development and deployment strategy. The five DoD software-intensive systems shown below were selected for this study, as they have been informally recognized as successful development programs for their use of Agile methodologies, integrated testing, and/or iterative development. The authors are not aware of any independent review that determines whether the programs are achieving their objectives.
• Joint Space Operations Center (JSpOC) Mission System (JMS), Air Force
• Distributed Common Ground System–Navy (DCGS-N) Increment 2, Navy
• Global Combat Support System–Joint (GCSS-J), Defense Information Systems Agency (DISA)
• Reserve Component Automation System (RCAS), Army
• Catapult/ANTS (Attack the Network Tool Suite), Joint Improvised-Threat Defeat Organization (JIDO)

Table 1 provides detailed characteristics of the programs that we surveyed and analyzed for their Agile development and test approaches. The programs are arranged into two groups: System of Systems (SoS) and Web Applications (Web Apps). The Acquisition Category (ACAT) has either (a) Automated Information Systems with the Defense Acquisition Executive (IAM) or (b) the DoD Component head (IAC), or designated delegates, as the decision authority.

TABLE 1. PROGRAM INFORMATION

Program            | Description                                                                                                              | Group    | Component | ACAT
JMS                | Space command and control situational awareness                                                                          | SoS      | Air Force | IAM
DCGS-N Increment 2 | Tactical gateway for Navy-unique sensor data across the intelligence community                                           | SoS      | Navy      | IAC
GCSS-J             | Joint military integrated logistics through applications and tools                                                       | Web Apps | DISA      | IAC
RCAS               | Integrated, web-based software solutions to manage mobilization, safety, personnel, and force authorization requirements | Web Apps | Army      | IAM
Catapult/ANTS      | Catapult integrates global intelligence data; ANTS addresses improvised explosive device threats                         | Web Apps | JIDO      | N/A

The SoS and Web Apps groupings are general in nature. The web application suites (GCSS-J, RCAS, and Catapult/ANTS) consist of one infrastructure system feeding multiple web applications, generally with loose coupling. Replicating the production environment in a development ecosystem is usually straightforward.
On the other hand, SoS (DCGS-N, JMS) are typically more complex than web application suites because they likely depend on tightly coupled, disparate systems that are sensitive to changes. Different vendors may have developed each subsystem, which increases integration challenges due to proprietary information or custom interfaces.

Agile Software Development

Agile Software Development Life Cycle (SDLC) methods have emerged as industry best practices. The Agile methodology emerged in 2001, when 17 leading software developers created the Agile Manifesto to design and share better ways to develop software. Agile values can be distilled into four core elements:

• Focusing on small, frequent capability releases
• Valuing working software over comprehensive documentation
• Responding rapidly to changes in operations, technology, and budgets
• Actively involving users throughout development to ensure high operational value

Each of the many existing Agile methodologies (e.g., Scrum, Extreme Programming [XP], Kanban) has its own unique processes, terms, techniques, and timelines.

DevOps Software Development

DevOps SDLC methodologies build upon Agile methodologies by addressing information technology (IT) operations team deployment activities, resulting in faster deployment to the production environment (daily or weekly instead of monthly or semiannually). DevOps works best when the system is in sustainment, when enhancements are smaller, and when a mature user base is established.
Increased emphasis is placed on shifting sequential processes into parallel processes and deploying capabilities to the user faster, which entails (Cagle, Kristan, & Rice, 2015):

• Strong use of automation (code, build, infrastructure/architecture, test)
• Continuous development, test, integration, and demonstration of the software throughout the life cycle
• A high level of collaboration and integration between development, quality assurance, and IT operations
• Static, dynamic, and fuzz testing techniques
• Open source and reusable components, services, and development tools

A variant of DevOps is Secure DevOps, which emphasizes security throughout the entire process. This variant is known as SecDevOps or DevSecOps, and incorporates automated security testing, monitoring, and secure handling of deployment, typically through hashed Docker containers.

Method

A team of DASD(DT&E) staff specialists developed a 38-question survey for PMO personnel to complete between June and August 2018. The survey contained questions in the following areas, with specific survey questions available upon request:

• Agile development, specifically method, management tools, and structure of teams
• Requirements tool usage
• Documents used during actual development
• Test planning, test events, and test implementer (developer, DT&E, operational test and evaluation [OT&E])
• Greatest successes and challenges to the DT&E program, and future goals
• Lessons learned and best practices in the areas of:
  ° Requirements
  ° Detailed planning
  ° Test events and execution

Additionally, the team interviewed the 45th and 47th Test Squadrons (TS) at Eglin Air Force Base (AFB), FL, to learn more about their Agile developmental testing (DT) efforts.
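The "strong use of automation" practice described above is most often realized as automated tests that run on every code commit, so that each build is continuously verified. As an illustrative sketch only (the logistics scenario, function, and user story below are invented, not drawn from the surveyed programs), a pytest-style test can encode a user story's acceptance criterion directly:

```python
# Hypothetical sketch: an acceptance criterion expressed as an
# automated test that a CI server would run on every commit.

def requisition_status(order_id: str, shipped: set) -> str:
    """Return a status string for a supply requisition (invented example)."""
    return "SHIPPED" if order_id in shipped else "PENDING"

# Invented user story: "As a logistics planner, I can see whether a
# requisition has shipped." Each acceptance criterion becomes a test.
def test_shipped_order_reports_shipped():
    assert requisition_status("REQ-001", {"REQ-001"}) == "SHIPPED"

def test_unshipped_order_reports_pending():
    assert requisition_status("REQ-002", {"REQ-001"}) == "PENDING"
```

Because tests like these run automatically on every change, regressions surface within minutes rather than at a late, dedicated test event, which is what allows DevOps programs to deploy daily or weekly.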
The 45th TS is the principal Air Force organization for strategic air and space, weapon platform, and business systems software DT&E. The 47th TS is the principal Air Force organization that evaluates cyber resiliency of both information- and avionics-based systems. Both belong to the 96th Cyberspace Test Group (F. B. Chavers, personal communication, June 13, 2018).

Analysis

The following sections compare and contrast the detail between the software development methods, requirements, detailed planning, and test events of the five programs studied. Emphasis is placed on automated testing, sprint-level testing, and user acceptance testing.

SDLC Method

Table 2 identifies each program's SDLC method, Agile type, tools, and method of defining requirements.

TABLE 2. PROGRAM SDLC

Program            | SDLC Method                   | Agile Type | Tools                | Requirements Definition
JMS                | Scrum/Waterfall               | Hybrid     | Jira, self-developed | User Story
DCGS-N Increment 2 | Scaled Agile Framework (SAFe) | Hybrid     | Jira                 | User Story
GCSS-J             | Scrum                         | Agile      | Rally, Jira          | User Story
RCAS               | SAFe, Secure DevOps           | Hybrid     | VersionOne           | User Story
Catapult/ANTS      | Secure DevOps                 | Hybrid     | Jira                 | User Story

The programs use industry-standard Agile SDLC tools, as referenced in the 12th Annual State of Agile Report, which states that 58% of self-reported agencies use Jira, 20% use VersionOne, 9% use CA Agile Central (Rally), and 46% use Excel (respondents could select multiple items) (CollabNet VersionOne, 2018). Within the report, "Respondents were asked whether they would recommend the tool(s) they are using based on their experience" (p. 15), with 81% recommending VersionOne, 77% recommending Jira, and 72% recommending CA Agile Central (Rally), the top three tools. Within the GCSS-J program, software developers use Rally for SDLC, and the testers use Jira for user stories and wireframes (detailed diagrams) to help design test cases. Jira provides issue/bug tracking and workflow management.
JMS uses Jira and a self-developed tool for internal tracking of requirements, which take the form of user stories, workflows, and requirements documents. The user stories are derived from workflows or checklists from current operations. Test cases are built over a compilation of user stories that sum to a specific functionality. VersionOne, used by RCAS, has a built-in requirements traceability matrix from capability to feature, to story, to task, to test case, which is extracted and delivered as a part of each release.
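The traceability chain described for VersionOne (capability to feature, to story, to task, to test case) is essentially a parent-link hierarchy: any test case can be walked back up to the capability it supports. A minimal sketch of that structure, with all item names invented for illustration:

```python
# Hypothetical sketch of a requirements traceability chain
# (capability -> feature -> story -> task -> test case).
# All item names are invented for illustration.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Item:
    level: str                      # capability, feature, story, task, test case
    name: str
    parent: Optional["Item"] = None

def trace_to_capability(item: Item) -> List[str]:
    """Walk parent links from any item up to its root capability."""
    chain = []
    node: Optional[Item] = item
    while node is not None:
        chain.append(f"{node.level}: {node.name}")
        node = node.parent
    return chain

cap = Item("capability", "Mobilization management")
feat = Item("feature", "Unit readiness dashboard", cap)
story = Item("story", "View readiness by unit", feat)
task = Item("task", "Implement readiness query", story)
tc = Item("test case", "Query returns correct readiness", task)

print(trace_to_capability(tc)[-1])  # -> capability: Mobilization management
```

Delivering an extract of this matrix with each release, as RCAS does, gives decision makers evidence that every delivered capability has been decomposed into tested work items.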