Complete Book Listing

   



Software Engineering

Evidence-Based Software Engineering and Systematic Reviews - Kitchenham, Budgen and Brereton

Author

Barbara Ann Kitchenham
David Budgen
Pearl Brereton

Cover Price : £ 99.99

Imprint : CRC Press
ISBN : 9781482228656
YOP : 2016

Binding : Hardback
Size : 6.25" X 9.50"
Total Pages : 433
CD : No

About the Book :-
Features:
Provides a clear introduction to the use of an evidence-based model for software engineering research and practice.
Explains the roles of primary studies (experiments, surveys, case studies) as elements of an over-arching evidence model, rather than as disjointed elements in the empirical spectrum.
Presents well-established and up-to-date guidelines for conducting secondary studies in software engineering.
Supplies guidance on how an evidence-based approach can influence teaching, practice, and research.

Summary:
In the decade since the idea of adapting the evidence-based paradigm for software engineering was first proposed, it has become a major tool of empirical software engineering. Evidence-Based Software Engineering and Systematic Reviews provides a clear introduction to the use of an evidence-based model for software engineering research and practice. The book explains the roles of primary studies (experiments, surveys, case studies) as elements of an over-arching evidence model, rather than as disjointed elements in the empirical spectrum. Supplying readers with a clear understanding of empirical software engineering best practices, it provides up-to-date guidance on how to conduct secondary studies in software engineering, replacing the existing 2004 and 2007 technical reports.

The book is divided into three parts. The first part discusses the nature of evidence and the evidence-based practices centred on a systematic review, both in general and as applied to software engineering. The second part examines the different elements that provide inputs to a systematic review (usually considered as forming a secondary study), especially the main forms of primary empirical study currently used in software engineering. The final part provides practical guidance on how to conduct systematic reviews (the guidelines), drawing together accumulated experience to guide researchers and students in planning and conducting their own studies. The book includes an extensive glossary and an appendix that provides a catalogue of reviews that may be useful for practice and teaching.

Contents :-
PART I: EVIDENCE-BASED PRACTICES IN SOFTWARE ENGINEERING
The Evidence-Based Paradigm: What do we mean by evidence? Emergence of the evidence-based movement. The systematic review. Some limitations of an evidence-based view of the world.
Evidence-Based Software Engineering (EBSE): Empirical knowledge before EBSE. From opinion to evidence. Organizing evidence-based software engineering practices. Software engineering characteristics. Limitations of evidence-based practices in software engineering.
Using Systematic Reviews in Software Engineering: Systematic reviews. Mapping studies. Meta-analysis.
Planning a Systematic Review: Establishing the need for a review. Managing the review project. Specifying the research questions. Developing the protocol. Validating the protocol.
Searching for Primary Studies: Completeness. Validating the search strategy. Methods of searching. Examples of search strategies.
Study Selection: Selection criteria. Selection process. The relationship between papers and studies. Examples of selection criteria and process.
Assessing Study Quality: Why assess quality? Quality assessment criteria. Procedures for assessing quality. Examples of quality assessment criteria and procedures.
Extracting Study Data: Overview of data extraction. Examples of extracted data and extraction procedures.
Mapping Study Analysis: Analysis of publication details. Classification analysis. Automated content analysis. Clusters, gaps, and models.
Qualitative Synthesis: Qualitative synthesis in software engineering research. Qualitative analysis terminology and concepts. Using qualitative synthesis methods in software engineering systematic reviews. Description of qualitative synthesis methods. General problems with qualitative meta-synthesis.
Meta-Analysis (with Lech Madeyski): Meta-analysis example. Effect sizes. Conversion between different effect sizes. Meta-analysis methods. Heterogeneity. Moderator analysis. Additional analyses.
Reporting a Systematic Review: Planning reports. Writing reports. Validating reports.
Tool Support for Systematic Reviews (with Christopher Marshall): Review tools in other disciplines. Tools for software engineering reviews.
Evidence to Practice: Knowledge Translation and Diffusion: What is knowledge translation? Knowledge translation in the context of software engineering. Examples of knowledge translation in software engineering. Diffusion of software engineering knowledge. Systematic reviews for software engineering education.
Further Reading for Part I.

PART II: THE SYSTEMATIC REVIEWER'S PERSPECTIVE OF PRIMARY STUDIES
Primary Studies and Their Role in EBSE: Some characteristics of primary studies. Forms of primary study used in software engineering. Ethical issues. Reporting primary studies. Replicated studies. Further reading.
Controlled Experiments and Quasi-Experiments: Characteristics of controlled experiments and quasi-experiments. Conducting experiments and quasi-experiments. Research questions that can be answered by using experiments and quasi-experiments. Examples from the software engineering literature. Reporting experiments and quasi-experiments. Further reading.
Surveys: Characteristics of surveys. Conducting surveys. Research questions that can be answered by using surveys. Examples of surveys from the software engineering literature. Reporting surveys. Further reading.
Case Studies: Characteristics of case studies. Conducting case study research. Research questions that can be answered by using case studies. Example of a case study from the software engineering literature. Reporting case studies. Further reading.
Qualitative Studies: Characteristics of a qualitative study. Conducting qualitative research. Research questions that can be answered using qualitative studies. Examples of qualitative studies in software engineering. Reporting qualitative studies. Further reading.
Data Mining Studies: Characteristics of data mining studies. Conducting data mining research in software engineering. Research questions that can be answered by data mining. Examples of data mining studies. Problems with data mining studies in software engineering. Reporting data mining studies. Further reading.
Replicated and Distributed Studies: What is a replication study? Replications in software engineering. Including replications in systematic reviews. Distributed studies. Further reading.

PART III: GUIDELINES FOR SYSTEMATIC REVIEWS
Systematic Review and Mapping Study Procedures: Introduction. Preliminaries. Review management. Planning a systematic review. The search process. Primary study selection process. Validating the search and selection process. Quality assessment. Data extraction. Data aggregation and synthesis. Reporting the systematic review.
A Catalogue of Systematic Reviews Relevant to Education and Practice (with Sarah Drummond and Nikki Williams): Professional Practice (PRF). Modelling and Analysis (MAA). Software Design (DES). Validation and Verification (VAV). Software Evolution (EVO). Software Process (PRO). Software Quality (QUA). Software Management (MGT).
Bibliography
Index

Essential Software Testing : A Use-Case Approach - Greg Fournier

Author

Greg Fournier

Cover Price : £ 66.99

Imprint : T & F / Routledge
ISBN : 1420089813
YOP : 2009

Binding : Paperback
Total Pages : 280
CD : No

About the Book :- Much has been written about the difficulty of software testing. Often these laments are accompanied by cautionary words about how careful one has to be to ensure testing is done properly. However, there is a dearth of resources that give practical guidance on the nuts and bolts of testing. Essential Software Testing: A Use-Case Approach describes testing methods and techniques in a common-sense manner that is easy to understand, helping readers to quickly and effectively implement project-specific testing solutions.

Divided into three parts, the book first discusses ways to make testing agile, providing insight into how testing can be done efficiently in different process environments. Next, the book supplies an overview of testing concepts. Lastly, it demonstrates how to perform the actual test, detailing specific testing activities that can be used on almost any project, with particular attention given to use-case-driven testing. It describes how to test using use cases regardless of the specific requirements of the project. The author weaves helpful war stories throughout the text, placing the concepts in a concrete framework. This guide gives software testers a firm grasp of all testing fundamentals: how to determine what to test and how to test it, how to select proper tests to match the plan, techniques to build and trace tests, and finally, how to conduct and record tests.

Contents :-
Dedication. Preface. Acknowledgments.
Part One : Testing Essentially
Chapter 1. On Being A Tester
Chapter 2. Basic Concepts Boot Camp
Chapter 3. Examples From My Experience We'll Work With
Chapter 4. What is Essential Testing?
Chapter 5. Essential and Efficient Testing
Chapter 6. Being Essentially Agile
Chapter 7. Build Testing Agility Into Any Project
Part Two : Fundamentals For Testing Success
Chapter 8. Requirements – Fundamentals For Testing Success
Chapter 9. Use Cases For Testers
Chapter 10. Building A Test Process That Fits
Part Three : The Successful Testing Process
Chapter 11. Essential Test Planning
Chapter 12. Grouping Requirements With Use Cases
Chapter 13. Extending Use Cases For Testing
Chapter 14. Identifying Tests
Chapter 15. Essential Test Cases
Chapter 16. Adding Test Design To Your Test Case
Chapter 17. Creating Tests
Chapter 18. Executing Tests
Chapter 19. Essential Traceability
Chapter 20. It All Comes Together Like This
Chapter 21. Conclusion
Appendix A. Appendix B. Examples. Appendix C. Templates. Index.

About the Author :- Greg Fournier is the principal and founder of Enlighten Solutions, where he is involved in consulting, mentoring, and instructing companies in all aspects of software development. With over 20 years of experience, he has worked on projects in different capacities, including project management, business process analysis, requirements analysis, system architecture and design, and testing. Greg has spent much of the last decade teaching, mentoring, and consulting in many areas of software development and process with companies and organizations such as Wachovia, Siemens AG, the United States military, Northrop Grumman, Qwest Communications, Schering-Plough, General Electric, and Blue Cross Blue Shield of North Carolina.


   
