Cuauhtémoc López-Martín, Ali Bou Nassif, Alain Abran, A training process for improving the quality of software projects developed by a practitioner, Journal of Systems and Software, Volume 131, September 2017, Pages 98-111, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2017.05.050. (https://www.sciencedirect.com/science/article/pii/S0164121217300973) Abstract: Background: The quality of a software product depends on the quality of the software process followed in developing the product. Therefore, many higher education institutions (HEI) and software organizations have implemented software process improvement (SPI) training courses to improve software quality. Objective: Because the duration of a course is a concern for HEIs and software organizations, we investigate whether the quality of software projects will be improved by reorganizing the activities of the ten assignments of the original personal software process (PSP) course into a modified PSP with fewer assignments (i.e., seven assignments). Method: The assignments were developed by following a modified PSP with fewer assignments but including the phases, forms, standards, and logs suggested in the original PSP. The measurement of the quality of the software assignments was based on defect density. Results: When the activities in the original PSP were reordered into fewer assignments, the defect density improved with statistical significance as practitioners progressed through the PSP training. Conclusions: Our modified PSP could be applied in academic and industrial environments concerned with reducing PSP training time. Keywords: Software engineering education and training; Software process improvement; Software quality improvement; Personal software process

Dehua Ju, Beijun Shen, Internet of Knowledge Plus Knowledge Cloud–A Future Education Ecosystem, IERI Procedia, Volume 2, 2012, Pages 331-336, ISSN 2212-6678, https://doi.org/10.1016/j.ieri.2012.06.097.
(https://www.sciencedirect.com/science/article/pii/S2212667812001050) Abstract: Internet of Knowledge (IoK) and Knowledge Cloud are proposed and developed in this paper to create an easily accessible education ecosystem that facilitates millions of professionals attaining knowledge on demand, as an auxiliary step in shifting to a modern education paradigm. The authors have defined and compiled dozens of prototype IoK systems, dedicated to software engineering, service engineering, and advanced IT applications. Keywords: IoK; Knowledge cloud; Knowledge service; KaaS; BOK-based; Education ecosystem

Carisa Bohus, Lawrence A. Crowl, Burçin Aktan, Molly H. Shor, Running Control Engineering Experiments Over the Internet, IFAC Proceedings Volumes, Volume 29, Issue 1, June–July 1996, Pages 2919-2927, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)58121-5. (https://www.sciencedirect.com/science/article/pii/S1474667017581215) Abstract: An important issue in engineering education is the availability of laboratory resources for student use. Using a computer network to link geographically distant students with laboratory teaching resources makes expensive and innovative equipment available to more students. A working environment is now available where remotely located students can develop and run controllers on experiments in the Oregon State University (OSU) control engineering laboratory. Remote users can watch the experiment in real time from a remote workstation, hear the sounds in the laboratory, and interact with other laboratory users. Remote power control, network reliability, and safety features are integrated into the experimental hardware and software design.
Keywords: Control education; Laboratory education; Educational aids; Human-machine interface; Interconnection technology; Multimedia; Safety; User interfaces

James Goedert, Yong Cho, Mahadevan Subramaniam, Haifeng Guo, Ling Xiao, A framework for Virtual Interactive Construction Education (VICE), Automation in Construction, Volume 20, Issue 1, January 2011, Pages 76-87, ISSN 0926-5805, https://doi.org/10.1016/j.autcon.2010.07.002. (https://www.sciencedirect.com/science/article/pii/S0926580510001007) Abstract: Training and process analysis in the construction industry have not taken full advantage of new technologies in simulation, modeling, the semantic web, and software engineering. The purpose of this research is to develop a framework for a virtual interactive construction education system taking full advantage of these technologies. The modules will simulate the construction process for a facility from start to finish using information drawn from domain experts on real projects in the built environment. These modules can be used as training tools for new employees, who attempt to optimize time and cost in a virtual environment given a limited number of equipment, time, and employee options. They can also be used as a process analysis tool before, during, and after construction, where a number of situational variables can be analyzed for exposure to potential risk. These modules would be particularly useful for repetitive construction, where the initial project or task is analyzed for optimization and risk mitigation. This paper describes the framework using a residential construction example that is a 900 square foot (about 85 m²) wood-frame single-family house designed for the United States.
Keywords: Virtual simulation; Games; Construction education; Training

Jenny Preece, Laurie Keller, Teaching the practitioners: developing a distance learning postgraduate HCI course, Interacting with Computers, Volume 3, Issue 1, April 1991, Pages 92-118, ISSN 0953-5438, https://doi.org/10.1016/0953-5438(91)90006-N. (https://www.sciencedirect.com/science/article/pii/095354389190006N) Abstract: This paper reports on HCI education and on issues in HCI needing resolution when developing a course in human-computer interaction. We also look at how HCI can be taught, particularly to professional engineers, scientists and managers, using distance teaching and predicated on students using their industrial base as a classroom and laboratory. The paper also draws a comparison between the practices of user-centred iterative software design and the way that our course was developed. Keywords: curriculum development; multidisciplinary; theory; practice; knowledge; tools; terminology; distance learning; user-centred course development

Richard E. (Dick) Fairley, The influence of COCOMO on software engineering education and training, Journal of Systems and Software, Volume 80, Issue 8, August 2007, Pages 1201-1208, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2006.09.044. (https://www.sciencedirect.com/science/article/pii/S0164121206002901) Abstract: As the discipline of software engineering has matured, COCOMO (constructive cost model) has evolved, both in response to and as a leading indicator of changes in software engineering methods and techniques. This paper traces the evolution of the COCOMO cost estimation models from 1981 to 2005. In particular, COCOMO 81, Ada COCOMO, and COCOMO II are presented. COCOMO has been, and continues to be, a vehicle for introducing and illustrating software engineering methods and techniques. Emphasis is placed on the role COCOMO models have played, and continue to play, in software engineering education and training.
Keywords: Cost estimation; COCOMO; Ada COCOMO; COCOMO II; Software engineering education and training; Teachable moments

Judy M. Emms, Hugh M. Robinson, Peter G. Thomas, Aspects of teaching software engineering at a distance, Education and Computing, Volume 4, Issue 1, 1988, Pages 37-44, ISSN 0167-9287, https://doi.org/10.1016/S0167-9287(88)80024-4. (https://www.sciencedirect.com/science/article/pii/S0167928788800244) Abstract: This paper describes the teaching at a distance of some aspects of software engineering. It describes an undergraduate course in some detail, addresses three problem areas encountered in the writing of the course, and explains the solutions that were adopted. Initial feedback on the course suggests that the attempt has been successful and that the approach is one that could usefully be employed elsewhere. Keywords: Teaching at a distance; Software Engineering; Formal methods; Abstract data types; Educational software; Home computing

Cuauhtemoc Lopez-Martin, A fuzzy logic model for predicting the development effort of short scale programs based upon two independent variables, Applied Soft Computing, Volume 11, Issue 1, January 2011, Pages 724-732, ISSN 1568-4946, https://doi.org/10.1016/j.asoc.2009.12.034. (https://www.sciencedirect.com/science/article/pii/S1568494609002919) Abstract: Fuzzy models have recently been used for estimating the development effort of software projects, and this practice could start with short scale programs. In this paper, new and changed (N&C) as well as reused code were gathered from small programs developed by 74 programmers using practices of the Personal Software Process; these data were used as input for a fuzzy model for estimating the development effort. The accuracy of this fuzzy model was compared with the accuracy of a statistical regression model.
Two samples of 163 and 68 programs were used for verifying and validating the models, respectively; the comparison criterion was the Mean Magnitude of Error Relative to the estimate (MMER). In the verification and validation stages, the fuzzy model kept an MMER lower than or equal to that of the regression model, and a comparison of the models' accuracies based on ANOVA did not show a statistically significant difference between their means. This result suggests that fuzzy logic could be used for predicting the effort of small programs based upon these two kinds of lines of code. Keywords: Software effort estimation; Fuzzy logic; Multiple linear regression; Personal Software Process; Software engineering education and training

Ch. Bouras, P. Destounis, J. Garofalakis, A. Gkamas, G. Sakalis, E. Sakkopoulos, J. Tsaknakis, Th. Tsiatsos, Efficient web-based open and distance learning services, Telematics and Informatics, Volume 17, Issue 3, December 2000, Pages 213-237, ISSN 0736-5853, https://doi.org/10.1016/S0736-5853(00)00012-5. (https://www.sciencedirect.com/science/article/pii/S0736585300000125) Abstract: In this paper, we present data management issues faced during the design and development of an open distance learning system for the University of Patras, Greece. In order to handle data efficiently, as required in a web tele-training application, different strategies must be deployed for each type of information maintained, according to its behaviour and structure. The diversity and complexity of data, the network aspect of the application, and web deficiencies impose an architecture design incorporating a plethora of technologies and tools that must be integrated in such a fashion that they efficiently organise these data while preserving their relationships. This presents a software engineering challenge requiring coherence of solutions at all levels: structures, consistency, security, models, and protocols.
The paper presents the data components of an open and distance learning (ODL) system that access the information stored in a database and the file system, their underlying technology, their interaction with the network services, and the ways they address issues faced in an open, vendor-independent distance learning environment, and outlines the system's overall architecture. In addition, this paper presents the architecture, design, and services of a network-based information system that supports open and distance learning activities. The open and distance learning information system (ODLIS) offers synchronous and asynchronous distance learning and management information system (MIS) services to support the educational procedure. The ODLIS is a web-based application which runs over the Internet using real-time protocols. Keywords: Web; ODL; ODL system; CSCW/L

Cho-Chien Lu, Shih-Chung Kang, Shang-Hsien Hsieh, Ruei-Shiue Shiu, Improvement of a computer-based surveyor-training tool using a user-centered approach, Advanced Engineering Informatics, Volume 23, Issue 1, January 2009, Pages 81-92, ISSN 1474-0346, https://doi.org/10.1016/j.aei.2008.07.001. (https://www.sciencedirect.com/science/article/pii/S147403460800058X) Abstract: This paper presents the experiences of improving an existing surveyor-training tool, called SimuSurvey, using a user-centered approach. As few users were involved during the initial development of SimuSurvey, many instructors and students were skeptical about the innovative application of SimuSurvey in actual surveying classes. To address this problem, we proposed and applied an iterative and incremental user-centered design method to redevelop the tool. Three hundred and forty-six users, including 5 instructors, 4 surveying experts, and 337 students with different backgrounds, were introduced at different stages of the redevelopment process.
After two iterations of complete redevelopment cycles, with five intermediate prototype systems generated, a much improved version of the tool, namely SimuSurvey R2, was developed. The final interviews with students and field observation of user groups showed SimuSurvey R2 to be more practical for use in actual surveying classes. In addition, the proposed user-centered approach and several techniques it employs, such as storyboards and content diagrams, paper-based prototyping, high-fidelity prototyping, and usability tests, have been found to be effective for the improvement (or redevelopment) of software systems. Keywords: Iterative and incremental development; User-centered design; Software engineering; Surveyor-training; Engineering education; Computer-aided instruction

Jerry D. Cavin, The Role of Human Factors in Veteran SQA Training, Procedia Manufacturing, Volume 3, 2015, Pages 1535-1542, ISSN 2351-9789, https://doi.org/10.1016/j.promfg.2015.07.416. (https://www.sciencedirect.com/science/article/pii/S2351978915004175) Abstract: The difficulty of translating military experience into the civilian workforce has led to a high unemployment rate for returning veterans. In a recent collaboration between Bridge360 and Park University, a Software Quality Assurance training course was developed to assist returning veterans in learning the fundamentals of Software Quality Assurance. The objective of this paper is to describe how human factors played a role in the successful implementation of the veterans' SQA training course and their successful transition into the civilian workforce. The goal of the human factors methods used during the coursework was to provide a solid skill set for the veterans and the continuous improvement of the course material. This was accomplished through a series of lectures, tests, and surveys during the course. In addition to hands-on experiments, guest speakers and case studies were also shown to be important.
In the case of returning veterans, the application of human factors went beyond improving the courseware and included placing the veterans in SQA positions for which they were uniquely qualified. Their newly acquired abilities, combined with their military skill set, allowed them to make a successful transition from military service to software quality assurance internships. Their military skill set includes human factors such as the ability to work as a team member or as a team leader, the ability to work under pressure to meet deadlines, high standards of quality, and a serious commitment to excellence. In conclusion, human factors played a crucial role in the implementation of a successful training program in support of veterans returning to the workforce. Clearly there is a need for additional veteran training programs to help with the huge influx of returning veterans. Other institutions considering implementation of similar training programs should consider the importance of human factors as a framework to increase instructional quality and the competency measurements of the participants. Keywords: Veteran; Software quality assurance; Human factor; Training; Software testing

Saiqa Aleem, Luiz Fernando Capretz, Faheem Ahmed, A Digital Game Maturity Model (DGMM), Entertainment Computing, Volume 17, November 2016, Pages 55-73, ISSN 1875-9521, https://doi.org/10.1016/j.entcom.2016.08.004. (https://www.sciencedirect.com/science/article/pii/S1875952116300246) Abstract: Game development is an interdisciplinary concept that embraces artistic, software engineering, management, and business disciplines. This research facilitates a better understanding of important dimensions of digital game development methodology. Game development is considered one of the most complex tasks in software engineering.
The increased popularity of digital games, the challenges faced by game development organizations in developing quality games, and the high competition in the digital game industry demand a game development maturity assessment. Consequently, this study presents a Digital Game Maturity Model to evaluate the current development methodology in an organization. The framework of this model consists of assessment questionnaires, a performance scale, and a rating method. The main goal of the questionnaires is to collect information about current processes and practices. In general, this research contributes towards formulating a comprehensive and unified strategy for game development maturity evaluation. Two case studies were conducted and their assessment results reported. These demonstrate the level of maturity of current development practices in two organizations. Keywords: Software game; Game performance; Video game; Online game; Process assessment; Software process improvement; Game development methodology

Rayford B. Vaughn, Leadership by example: A perspective on the influence of Barry Boehm, Journal of Systems and Software, Volume 80, Issue 8, August 2007, Pages 1222-1226, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2006.09.045. (https://www.sciencedirect.com/science/article/pii/S0164121206002949) Abstract: Over the past 10 years, working with Dr. Boehm on various projects has both influenced the software engineering program at Mississippi State University and provided growth opportunities and expansion of the MSU ABET-accredited software engineering undergraduate degree program. Looking back over the key interactions with him, it is apparent that he leads (and influences) by his example, his work ethic, and his intellect in the software engineering field. This paper provides insights into his specific influences through collaborative work with another university.
Keywords: Software engineering education and training; Empirical software engineering

Dietmar Pfahl, Marco Klemm, Günther Ruhe, A CBT module with integrated simulation component for software project management education and training, Journal of Systems and Software, Volume 59, Issue 3, 15 December 2001, Pages 283-298, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(01)00069-3. (https://www.sciencedirect.com/science/article/pii/S0164121201000693) Abstract: Due to the increasing demand for software project managers in industry, efforts are needed to develop the management-related knowledge and skills of the current and future software workforce. In particular, university education needs to provide computer science and software engineering (SE) students not only with technology-related skills but also with a basic understanding of typical phenomena occurring in industrial (and academic) software projects. The objective of this paper is to present the concepts of a computer-based training (CBT) module for student education in software project management. The single-learner CBT module can be run using standard web browsers (e.g. Netscape). The simulation component of the CBT module is implemented using the system dynamics (SD) simulation modelling method. The paper presents the design of the simulation model and the training scenario offered by the existing CBT module prototype. Possibilities for empirical validation of the effectiveness of the CBT module in university education are described, results of a first controlled experiment are presented and discussed, and future extensions of the CBT module towards collaborative learning environments are suggested.
Keywords: Computer-based training; Controlled experiment; Empirical study; Process simulation; Software engineering education; Software project management; System dynamics

Nancy Mead, Kathy Beckman, Jimmy Lawrence, George O’Mary, Cynthia Parish, Perla Unpingco, Hope Walker, Industry/university collaborations: different perspectives heighten mutual opportunities, Journal of Systems and Software, Volume 49, Issues 2–3, 30 December 1999, Pages 155-162, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(99)00091-6. (https://www.sciencedirect.com/science/article/pii/S0164121299000916) Abstract: In this paper, we present the results of a survey of formal industry/university collaborations. The purpose of these collaborations is to meet the software engineering education and training needs of adult learners through joint ventures such as graduate programs (degree and certificate) and professional development activities (customized classes, seminars, forums, and conferences). Members of the Software Engineering Institute (SEI) working group on software engineering education and training conducted the survey in 1997–1998. The working group drew on the extensive experience of industry and university collaboration participants to help answer practical questions about the benefits of collaboration, the collaboration process itself, successful collaboration administration and programming, and lessons learned. Survey results are being published as a service to the software engineering education and training community to assist organizations interested in forming a new collaboration or improving an existing collaboration.
Keywords: Industry/university collaborations; Software engineering education and training; Collaboration benefits; Collaboration goals and measures; Collaboration process; Lessons learned

David Carrington, Paul Strooper, Sharron Newby, Terry Stevenson, An industry/university collaboration to upgrade software engineering knowledge and skills in industry, Journal of Systems and Software, Volume 75, Issues 1–2, 15 February 2005, Pages 29-39, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2004.02.020. (https://www.sciencedirect.com/science/article/pii/S0164121204000391) Abstract: This paper describes an ongoing collaboration between Boeing Australia Limited and the University of Queensland to develop and deliver an introductory course on software engineering. The aims of the course are to provide a common understanding of the nature of software engineering for all Boeing Australia's engineering staff, and to ensure they understand the practices used throughout the company. The course is designed so that it can be presented to people with varying backgrounds, such as recent software engineering graduates, systems engineers, quality assurance personnel, etc. The paper describes the structure and content of the course, and the evaluation techniques used to collect feedback from the participants and the corresponding results. The immediate feedback on the course indicates that it has been well received by the participants, but also indicates a need for more advanced courses in specific areas. The long-term feedback from participants is less positive, and the long-term feedback from the managers of the course participants indicates a need to expand on the coverage of the Boeing-specific processes and methods.
Keywords: Software engineering education; Industry training; Industry/university collaboration

Jan J van Amstel, The group education and training: IT's basic courses in Philips, Information and Software Technology, Volume 40, Issue 4, 15 July 1998, Pages 211-219, ISSN 0950-5849, https://doi.org/10.1016/S0950-5849(98)00041-X. (https://www.sciencedirect.com/science/article/pii/S095058499800041X) Abstract: Education and Training (E & T) is one of the groups of the Information and Software Technology sector within Philips Research. The IST sector is located in Eindhoven, The Netherlands. E & T provides for education and training of the professional software developers within Philips. The two most important courses in the area of basic education are the CSO (Course on Software Development) and CSE (Course on Software Engineering). The CSO covers advanced programming methodology (computing science; programming-in-the-small) and shows the application of this methodology in larger systems. The basic knowledge as presented by the CSO should be supplemented by knowledge of and skills in `programming-in-the-large'. This is taught in the CSE. Keywords: Education and training

Karen L. Medsker, Larry R. Medsker, Instructional technology: A key to successful information systems, Information & Management, Volume 12, Issue 4, April 1987, Pages 195-208, ISSN 0378-7206, https://doi.org/10.1016/0378-7206(87)90042-5. (https://www.sciencedirect.com/science/article/pii/0378720687900425) Abstract: Success and failure patterns of information systems development are widely discussed. However, the potential of education and training to address these problems has not received adequate attention. Managers and other information systems professionals need to be aware of the best technologies both for development of information systems and for the associated training of developers and end users. The best way to produce effective, efficient training is through instructional technology.
Although derived from system science, as are information system development methods, the processes and techniques of instructional technology are often unknown to information systems professionals. This paper discusses the parallels between these two disciplines and points out where each could benefit from the other. Keywords: Information Systems Development; Instructional Technology; Instructional Systems; Systems Analysis; Education; Training; Software Engineering; Systems Science

Remo Ferrari, Nazim H. Madhavji, Software architecting without requirements knowledge and experience: What are the repercussions?, Journal of Systems and Software, Volume 81, Issue 9, September 2008, Pages 1470-1490, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2007.12.764. (https://www.sciencedirect.com/science/article/pii/S0164121207003238) Abstract: Whereas the relationship between Requirements Engineering and Software Architecture (SA) has been studied increasingly in recent years in terms of methods, notations, representations, tools, development paradigms and project experiences, the relationship in terms of the human agents conducting these processes has not been explored scientifically. This paper describes the impact of requirements knowledge and experience (RKE) on software architecting tasks. Specifically, it describes an exploratory, empirical study involving 15 architecting teams, approximately evenly split between those teams with RKE and those without. Each team developed its own system architecture from the same given set of requirements in the banking domain. The subjects were all final year undergraduate or graduate students enrolled in a university-level course on software architectures. The overall results of this study suggest that architects with RKE develop higher-quality software architectures than those without, and that they have fewer architecture-development problems than did the architects without RKE.
This paper identifies specific areas of both architecture design and the architecture-development process where the differences between the RKE and non-RKE architects manifest. The paper also describes the possible implications of the findings for hiring and training, pedagogy, and technology. The empirical study was carried out using the “mixed methods” approach, involving both quantitative and qualitative aspects of the investigation. A by-product of this study is an architectural assessment instrument (included in the Appendix) for quantitative analysis of the quality of a software architecture. This paper also describes some new threads for future work. Keywords: Software Architecture; Requirements knowledge and experience; Software quality; Architectural assessment instrument; Attribute Driven Design (ADD) method; Hiring and training; Software engineering curriculum; Architecture and requirements technology; Quantitative and qualitative research; Empirical study

Andreas L. Symeonidis, Ioannis N. Athanasiadis, Pericles A. Mitkas, A retraining methodology for enhancing agent intelligence, Knowledge-Based Systems, Volume 20, Issue 4, May 2007, Pages 388-396, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2006.06.003. (https://www.sciencedirect.com/science/article/pii/S0950705106001687) Abstract: Data mining has proven a successful gateway for discovering useful knowledge and for enhancing business intelligence in a range of application fields. Incorporating this knowledge into already deployed applications, though, is highly impractical, since it requires reconfigurable software architectures, as well as human expert consulting. In an attempt to overcome this deficiency, we have developed Agent Academy, an integrated development framework that supports both design and control of multi-agent systems (MAS), as well as “agent training”.
We define agent training as the automated incorporation of logic structures generated through data mining into the agents of the system. The increased flexibility and cooperation primitives of MAS, augmented with the training and retraining capabilities of Agent Academy, provide a powerful means for the dynamic exploitation of knowledge extracted through data mining. In this paper, we present the methodology and tools for agent retraining. Through experimental results with the Agent Academy platform, we demonstrate how the extracted knowledge can be formulated and how retraining can lead to the improvement – in the long run – of agent intelligence. Keywords: Data mining; Multi-agent systems; Agent intelligence; Training; Retraining

Guoping Rong, He Zhang, Bohan Liu, Qi Shan, Dong Shao, A replicated experiment for evaluating the effectiveness of pairing practice in PSP education, Journal of Systems and Software, Volume 136, February 2018, Pages 139-152, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2017.08.011. (https://www.sciencedirect.com/science/article/pii/S0164121217301668) Abstract: Background: Handling large-sized classes is one of the major challenges in Personal Software Process (PSP) education in a tertiary education environment. We applied a pairing approach in PSP education and managed to mitigate the size challenge without sacrificing education effectiveness, which was verified in an experiment in 2010 (PSP2010). However, several issues exist in this experiment (e.g., mutual interference among student pairs, confusing evaluation comments, untraceable corrections, etc.), which may cloud a proper understanding of the education approach. Objective: In order to address the identified issues and better understand both the pros and cons of the pairing approach, we replicated the experiment in 2014.
Method: With a new lab arrangement and evaluation mechanism devised, the replication (PSP2014) involved 120 students after their first academic year, who were separated into two groups with 40 pairs of students in one group and 40 solo students in the other. Results: Results of the replication include: 1) paired students conformed to process discipline no worse (sometimes better) than solo students; 2) paired students performed better than solo students in the final exam; 3) both groups spent comparable amounts of time preparing submissions; 4) both groups performed similarly in size estimation and time estimation of the course assignments; 5) the quality of the programs developed by paired students was no worse (sometimes better) than that of solo students. Conclusion: The replication, together with the original study, confirms that, as an education approach, the pairing practice can reduce the amount of submissions required in PSP training without sacrificing (and sometimes improving) the education effectiveness. Keywords: Personal software process; Software engineering education; Replication

Alexei Lisounkin, Alexander Sabov, Gerhard Schreck, On Web-Based Architectures for Simulation Supported Training in Technical Networks, IFAC Proceedings Volumes, Volume 37, Issue 5, June 2004, Pages 173-178, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)32362-5. (https://www.sciencedirect.com/science/article/pii/S1474667017323625) Abstract: The paper presents an approach for the cost-effective implementation of simulation-supported training for plant operators. Simulation functions are seen as practical support for human cooperation with complex automation systems. A web-based system architecture is proposed, and a generic approach dedicated to modeling and simulation of technical networks is applied. The concept has been developed within a project of the Fraunhofer Association e-Industrial Services. Aspects of system implementation and application are outlined.
Keywords: Water industry; Web-based training; Software architectures; Process simulators; XML Santi Caballé, Fatos Xhafa, CLPL: Providing software infrastructure for the systematic and effective construction of complex collaborative learning systems, Journal of Systems and Software, Volume 83, Issue 11, November 2010, Pages 2083-2097, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2010.06.013. (https://www.sciencedirect.com/science/article/pii/S0164121210001652) Abstract: Over the last decade, e-Learning and in particular Computer-Supported Collaborative Learning (CSCL) needs have been evolving accordingly with more and more demanding pedagogical and technological requirements. As a result, high customization and flexibility are a must in this context, meaning that collaborative learning practices need to be continuously adapted, adjusted, and personalized to each specific target learning group. These very demanding needs of the CSCL domain represent a great challenge for the research community on software development to satisfy. This contribution presents and evaluates a previous research effort in the form of a generic software infrastructure called Collaborative Learning Purpose Library (CLPL) with the aim of meeting the current and demanding needs found in the CSCL domain. To this end, we experiment with the CLPL in order to offer an advanced reuse-based service-oriented software engineering methodology for developing CSCL applications in an effective and timely fashion. A validation process is provided by reporting on the use of the CLPL platform as the primary resource for the Master's thesis courses at the Open University of Catalonia when developing complex software applications in the CSCL domain. The ultimate aim of the whole research is to yield effective CSCL software systems capable of supporting and enhancing the current on-line collaborative learning practices. 
Keywords: Software architecture and design; Software engineering methods; Software reuse; Component-based software engineering; Model-driven engineering; Service orientation; SOA; Computer-supported collaborative learning; E-learning; Software and systems education Manal M. Alhammad, Ana M. Moreno, Gamification in software engineering education: a systematic mapping, Journal of Systems and Software, Available online 29 March 2018, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2018.03.065. (https://www.sciencedirect.com/science/article/pii/S0164121218300645) Abstract: The potential of gamification in education is based on the hypothesis that it supports and motivates students and can thus lead to enhanced learning processes and outcomes. Gamification in software engineering (SE) education is in its infancy. However, as SE educators we are particularly interested in understanding how gamification is pollinating our field and the extent to which the above claim is valid in our context. A systematic literature mapping has underscored the difficulty in fully corroborating the above claim because few empirical data are available so far. However, key trends and challenges have been identified. We found that the purpose of applying gamification in the SE field is mostly directly related to improving student engagement and, to a lesser extent, to improving student knowledge, although other targets are the application of SE best practices and socialization. We have also discussed insightful issues regarding the implementation cost of gamification, patterns in the most often used gamification elements, and the SE processes and teaching activities addressed. Of the identified challenges, we should highlight the complexity of deciding which gamification approach to follow, the lack of information for choosing gamification elements and the need to control the impact of gamification. Keywords: Gamification; Software Engineering; Education; Systematic Mapping Roelof K.
Brouwer, Witold Pedrycz, Training a feed-forward network with incomplete data due to missing input variables, Applied Soft Computing, Volume 3, Issue 1, July 2003, Pages 23-36, ISSN 1568-4946, https://doi.org/10.1016/S1568-4946(03)00003-6. (https://www.sciencedirect.com/science/article/pii/S1568494603000036) Abstract: Data available for training a neural network may be deficient not only in quantity; entire independent variables and their data may be missing, as is often the situation for software engineering data. This may cause the relation based on the available data to exhibit the property of one-to-many (o-m) valuedness or almost o-m valuedness. Multilayer perceptrons or feed-forward networks, however, are generally trained to represent functions or m-o mappings. The solution consists of adding another input to the standard feed-forward network and of modifying the training algorithm to allow for determination of this input for which no training data is available. If the values for the additional input are restricted to a discrete set then they may be perceived as cluster identifiers and the training method may be perceived as another form of clustering or segmentation of the input. If the missing input variable is assumed to have a finite number of values, the method proposed here may be compared to mixture of experts and mixture density networks, except that the proposed method is a more direct solution. The modified feed-forward network and training method have been successfully applied to several examples. Keywords: Feed forward neural networks; Relations; One-to-many mappings; Data segmentation; Clustering; Many-valued functions; Missing inputs; Incomplete training data M. Sharples, N. Jeffery, J.B.H. du Boulay, D. Teather, B. Teather, G.H.
du Boulay, Socio-cognitive engineering: A methodology for the design of human-centred technology, European Journal of Operational Research, Volume 136, Issue 2, 16 January 2002, Pages 310-323, ISSN 0377-2217, https://doi.org/10.1016/S0377-2217(01)00118-7. (https://www.sciencedirect.com/science/article/pii/S0377221701001187) Abstract: We describe a general methodology, socio-cognitive engineering, for the design of human-centred technology. It integrates software, task, knowledge and organizational engineering and has been refined and tested through a series of projects to develop computer systems to support training and professional work. In this paper we describe the methodology and illustrate its use through a project to develop a computer-based training system for neuro-radiology. Keywords: Human-centred technology; Software engineering; Organizational engineering; Human–computer interaction; Computer-based training; Radiology Mauricio R. de A. Souza, Lucas Veado, Renata Teles Moreira, Eduardo Figueiredo, Heitor Costa, A systematic mapping study on game-related methods for software engineering education, Information and Software Technology, Volume 95, March 2018, Pages 201-218, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2017.09.014. (https://www.sciencedirect.com/science/article/pii/S0950584917303518) Abstract: AbstractContext The use of games in software engineering education is not new. However, recent technologies have provided new opportunities for using games and their elements to enhance learning and student engagement. Objective The goal of this paper is twofold. First, we discuss how game-related methods have been used in the context of software engineering education by means of a systematic mapping study. Second, we investigate how these game-related methods support specific knowledge areas from software engineering. 
By achieving these goals, we aim not only to characterize the state of the art on the use of game-related methods on software engineering education, but also to identify gaps and opportunities for further research. Method We carried out a systematic mapping study to identify primary studies which address the use, proposal or evaluation of games and their elements on software engineering education. We classified primary studies based on type of approaches, learning goals based on software engineering knowledge areas, and specific characteristics of each type of approach. Results We identified 156 primary studies, published between 1974 and June 2016. Most primary studies describe the use of serious games (86) and game development (57) for software engineering education, while Gamification is the least explored method (10). Learning goals of these studies and their development of skills are mostly related to the knowledge areas of “Software Process”, “Software Design”, and “Professional Practices”. Conclusions The use of games in software engineering education is not new. However, there are some knowledge areas where the use of games can still be further explored. Gamification is a new trend and existing research in the field is quite preliminary. We also noted a lack of standardization both in the definition of learning goals and in the classification of game-related methods. Keywords: Software engineering education; Game-based learning; Gamification; Game development based learning Otávio Augusto Lazzarini Lemos, Fábio Fagundes Silveira, Fabiano Cutigi Ferrari, Alessandro Garcia, The impact of Software Testing education on code reliability: An empirical assessment, Journal of Systems and Software, Volume 137, March 2018, Pages 497-511, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2017.02.042. (https://www.sciencedirect.com/science/article/pii/S0164121217300419) Abstract: Abstract Software Testing (ST) is an indispensable part of software development. 
Proper testing education is thus of paramount importance. Indeed, the mere exposure to ST knowledge might have an impact on programming skills. In particular, it can encourage the production of more correct - and thus reliable - code. Although this is intuitive, to the best of our knowledge, there are no studies about such effects. Concerned with this, we have conducted two investigations related to ST education: (1) a large experiment with students to evaluate the possible impact of ST knowledge on the production of reliable code; and (2) a survey with professors who teach introductory programming courses to evaluate their level of ST knowledge. Our study involved 60 senior-level computer science students, 8 auxiliary functions with 92 test cases, a total of 248 implementations, and 53 professors of diverse subfields who completed our survey. The investigation with students shows that ST knowledge can improve code reliability in terms of correctness by as much as 20%, on average. On the other hand, the survey with professors reveals that, in general, university instructors tend to lack the same knowledge that would help students increase their programming skills toward more reliable code. Keywords: Software Testing; Computer science education; Student experiments Ivan Mustakerov, Daniela Borissova, A conceptual approach for development of educational Web-based e-testing system, Expert Systems with Applications, Volume 38, Issue 11, October 2011, Pages 14060-14064, ISSN 0957-4174, https://doi.org/10.1016/j.eswa.2011.04.214. (https://www.sciencedirect.com/science/article/pii/S095741741100741X) Abstract: The paper describes a conceptual approach for the development of an educational Web-based e-testing system. The system aims to improve the effectiveness of the learning process by providing features for flexible adjusting of the testing process.
The tutors and learners can interact with the e-testing system by choosing different options for testing – easy questions, advanced questions, sequential or shuffle order, randomly generated questions within a given range. The e-testing system can be used by students for self-testing or by tutors for official examination. When used for official examination, the test results information is sent to the examiner by e-mail. The system files are stored on the server side and are accessible through a Web browser. A prototype of the e-testing system illustrating the proposed approach was developed by means of the HTML and JavaScript languages. It was used for “C Programming Language” course students’ self-testing and official examination. The experimental results demonstrated that the implementation of the proposed approach is quite helpful in facilitating the understanding and implementation of teachers’ attitudes toward the application of Web-based e-testing tools. Keywords: Educational information system; Web-based system; Software architecture; Interactive learning environments; Self-testing knowledge Carlos A. Jara, Francisco A. Candelas, Fernando Torres, Christophe Salzmann, Denis Gillet, Francisco Esquembre, Sebastián Dormido, Synchronous collaboration between auto-generated WebGL applications and 3D virtual laboratories created with Easy Java Simulations, IFAC Proceedings Volumes, Volume 45, Issue 11, 2012, Pages 160-165, ISSN 1474-6670, https://doi.org/10.3182/20120619-3-RU-2024.00039. (https://www.sciencedirect.com/science/article/pii/S1474667015375960) Abstract: This paper presents a new collaborative e-learning system based on real-time synchronized communication among virtual laboratories. This original approach provides a new tool which integrates virtual laboratories inside a synchronous collaborative e-learning framework. This system is based on the automatic generation of WebGL simulations from 3D Easy Java Simulations applets.
These WebGL simulations are synchronized through the Internet in order to integrate them in an on-line collaborative environment. In this way, several students can attend a virtual class using only a communication device with any WebGL-enabled web browser. The paper also describes the software architecture on which this new education tool is based. Keywords: collaborative environment; distance teaching; Java 3D; virtual laboratories; WebGL Marcus Deininger, Anke Drappa, SESAM - A Simulation System for Project Managers, IFAC Proceedings Volumes, Volume 29, Issue 2, September 1996, Pages 77-82, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)43781-5. (https://www.sciencedirect.com/science/article/pii/S1474667017437815) Abstract: Project management is an adventurous task: Just like an explorer making his way through foreign land, project managers have to be well-prepared and trained to reduce risks and turn the adventure into a foreseeable enterprise. For project manager training, we initiated SESAM (Software Engineering Simulation by Animated Models). SESAM is our software engineering research and education project; we simulate software projects as interactive adventure games. Trainees can playfully gain experience and try different scenarios. SESAM comprises a modelling approach, a notation, and a programming system which supports model building, animation and validation. A first, simple simulation model has been used in a project management training course. The model turned out to be sufficient to an amazing extent: Many effects well-known from real projects arose during the simulation. The participants made many characteristic errors which led to plausible project distortions. The evolution of the simulated projects provided valuable feedback to teachers and participants.
Keywords: Educational aids; Process simulator; Project management; Training Affan Yasin, Lin Liu, Tong Li, Jianmin Wang, Didar Zowghi, Design and preliminary evaluation of a cyber Security Requirements Education Game (SREG), Information and Software Technology, Volume 95, March 2018, Pages 179-200, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2017.12.002. (https://www.sciencedirect.com/science/article/pii/S0950584917301921) Abstract: Context: Security, in digitally connected organizational environments of today, involves many different perspectives, including social, physical, and technical factors. In order to understand the interactions among these correlated aspects and elicit potential threats geared towards a given organization, different security requirements analysis approaches are proposed in the literature. However, the body of knowledge is yet to unleash its full potential due to the complex nature of security problems, and inadequate ways to improve security awareness of key players in the organization. Objective: The objective of the research study is to improve the security awareness of players utilizing serious games via: (i) know-how of security concepts and security protection; (ii) a guided process of identifying valuable assets and vulnerabilities in a given organizational setting; (iii) a guided process of defining successful security attacks on the organization. Method: Important methods used to address the above objectives include: (i) a comprehensive review of the literature to better understand security and game design elements; (ii) designing a serious game using cyber security knowledge and game-based techniques combined with security requirements engineering concepts; (iii) using empirical evaluation (observation and survey) to verify the effectiveness of the proposed game design.
Result: The solution proposed is a serious game for security requirements education, which: (i) can be an effective and fun way of learning security related concepts; (ii) mimics a real life problem setting in a presentable and understandable way; (iii) motivates players to learn more about security related concepts in future. Conclusion: From this study, we conclude that the proposed Security Requirement Education Game (SREG) has positive results and is helpful for players of the game to get an understanding of security attacks and vulnerabilities. Keywords: Organizational security; Security requirements inception; Requirements engineering; Security awareness; Security education; Serious game; Social engineering; Cyber security; Empirical study Axel Junger, Achim Michel, Matthias Benson, Lorenzo A Quinzio, Johannes Hafer, Bernd Hartmann, Patrick Brandenstein, Kurt Marquardt, Gunter Hempelmann, Evaluation of the suitability of a patient data management system for ICUs on a general ward, International Journal of Medical Informatics, Volume 64, Issue 1, November 2001, Pages 57-66, ISSN 1386-5056, https://doi.org/10.1016/S1386-5056(01)00202-7. (https://www.sciencedirect.com/science/article/pii/S1386505601002027) Abstract: The development of the ICUData patient data management system (PDMS) for intensive care units (ICU), by IMESO GmbH, Hüttenberg, Germany, was based on the assumption that processes and therapies at ICU are the most complex with the highest data density compared with those in other wards. Based on experience with the system and on a survey conducted among users at our pain clinic, we evaluated whether the concept of the present software architecture, which sufficiently reproduces processes and data at an ICU, is suitable as a PDMS for general wards. The highly modular and client-centric approach of the PDMS is founded on a message-based communications architecture (HL7). 
In the beginning of the year 2000, the system was implemented at the pain management clinic (12 beds) of our hospital. To assess its user friendliness, we conducted a survey of medical staff (n=14). From April 1st 2000 to August 31st 2000, all clinical and administrative data of 658 patients at the pain management clinic were recorded with the PDMS. From the start, all users had access to data and information of other connected data management systems of the hospital (e.g. patient administrative data, patient clinical data). Staff members found the system mostly useful, clearly presented, practical, and easy to learn and use. Users were relatively satisfied with stability and performance of the program but mentioned having only limited knowledge of the program's features. The need for external support during a computer crash was rated negatively. Despite the need for further usage training and improved program performance, the software architecture described seems to be a promising starting point for the construction of a PDMS for general wards. Keywords: Medical record systems; Computerized-digital patient file; Hospital information system; User-computer interface; Computer user training Jari Vanhanen, Timo O.A. Lehtinen, Casper Lassenius, Software engineering problems and their relationship to perceived learning and customer satisfaction on a software capstone project, Journal of Systems and Software, Volume 137, March 2018, Pages 50-66, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2017.11.021. (https://www.sciencedirect.com/science/article/pii/S0164121217302716) Abstract: Abstract In educational projects, having students encounter problems is desirable, if it increases learning. However, in capstone projects with industrial customers, negative effects problems can have on customer satisfaction must be considered. 
We conducted a survey in a capstone project course in order to study problems, learning and customer satisfaction related to eleven software engineering topics. On the average, students working in the managerial roles learned quite a lot about each topic, and the developers learned moderately, but the degree of learning varied a lot among the teams, and among the team members. The most extensively encountered problems were related to testing, task management, effort estimation and technology skills. The developers contributed quite a lot to solving problems with technology skills, but only moderately or less with other topics, whereas the managers contributed quite a lot with most of the topics. Contributing to solving problems increased learning moderately for most of the topics. The increases were highest with maintaining motivation and technology skills. Encountering problems with task management, customer expectations and customer communication affected customer satisfaction very negatively. When considering both learning and customer satisfaction, the best topics to encounter problems in were effort estimation, testing, and technology skills. Keywords: Capstone project; Education; Learning; Customer satisfaction; Problems; Software engineering U. Klein, Simulation-based distributed systems: serving multiple purposes through composition of components, Safety Science, Volume 35, Issues 1–3, June 2000, Pages 29-39, ISSN 0925-7535, https://doi.org/10.1016/S0925-7535(00)00020-5. (https://www.sciencedirect.com/science/article/pii/S0925753500000205) Abstract: The paper presents a new information technology approach for simulation-based systems capable of serving multiple purposes through the composition of components. 
Based on the High Level Architecture for Modeling and Simulation (HLA) software architecture for distributed simulation systems, this approach takes into account the need for flexibility, re-usability and interoperability in order to be applicable in different operational modes such as system design and development, operation, and training, as well as risk and scenario management. A prototype dispatching system for public transportation demonstrates the concept. Keywords: High Level Architecture for Modeling and Simulation; Distributed simulation; Interoperability; Multi-purpose; Training; Risk assessment Shimaa Ouf, Mahmoud Abd Ellatif, S.E. Salama, Yehia Helmy, A proposed paradigm for smart learning environment based on semantic web, Computers in Human Behavior, Volume 72, July 2017, Pages 796-818, ISSN 0747-5632, https://doi.org/10.1016/j.chb.2016.08.030. (https://www.sciencedirect.com/science/article/pii/S0747563216305957) Abstract: The current approaches to e-learning face challenges, including the isolation of learners from the learning process and a shortage of learning-process quality. Researchers have suggested that the next generation of e-learning is the e-learning ecosystem. An e-learning ecosystem has many advantages: learners form groups, collaborate with each other and with educators, and content is designed for interaction. However, the e-learning ecosystem faces some issues. It applies a teacher-student model in which a fixed learning pathway is considered suitable for all learners. Consequently, learners are presented with limited personalized materials. The e-learning ecosystem needs to incorporate the concept of personalization. Semantic web ontology-based personalization of the learning environment plays a leading role in building a smart e-learning ecosystem. This paper presents a detailed study of research papers that apply ontology within learning environments.
Most of these studies focus on personalizing e-learning by providing learners with suitable learning objects, ignoring the other learning-process components. This paper proposes and implements a framework for a smart e-learning ecosystem using ontology and SWRL. A new direction is proposed, which fosters the creation of four separate ontologies for the personalized full learning package, composed of a learner model and all the learning-process components (learning objects, learning activities and teaching methods). Keywords: E-Learning ecosystem; Personalization; Ontology; Software architecture; Semantic Web Rule Language; Learner model Tony Carroll, A strategy for empowerment: The role of midwives in computer systems implementation, Computer Methods and Programs in Biomedicine, Volume 54, Issues 1–2, September 1997, Pages 101-113, ISSN 0169-2607, https://doi.org/10.1016/S0169-2607(97)00039-4. (https://www.sciencedirect.com/science/article/pii/S0169260797000394) Abstract: The procurement and implementation of patient administration systems has been done on numerous occasions in the past. The Rotunda project, however, encompassed major bespoke clinical developments which were going to impact upon large clusters of midwives, and medical staff to a lesser extent. A broad-based, four-level structured methodology was used to implement the project, which is significantly ahead of schedule. This methodology, together with its strengths and weaknesses, is comprehensively discussed. The empowerment of midwives, and their roles in systems analysis and design, software testing and organisational re-engineering, is described. The importance of undertaking comprehensive computer training is highlighted, and a compact 10 h information technology course coupled with ongoing educational and related activities, which could be adopted by any organisation, is documented. The seven deadly sins of project management are mapped out. An update on benefits realisation is provided.
Gender issues are also discussed. Keywords: Midwives; Empowerment; Training; Project Management; Gender; Re-engineering Thomas Tometzki, Marten Völker, Christian Blichmann, Ernesto Elias-Nieland, Christian Sonntag, Sebastian Engell, Learn2Control: A Web-based Framework for Project-Oriented Control Education, IFAC Proceedings Volumes, Volume 41, Issue 2, 2008, Pages 14624-14629, ISSN 1474-6670, https://doi.org/10.3182/20080706-5-KR-1001.02477. (https://www.sciencedirect.com/science/article/pii/S147466701641342X) Abstract: Abstract Important skills of a control engineer are the ability to subdivide complex control design projects into smaller design steps and to solve the subproblems, taking into account the interdependencies between the subtasks as well as the overall goals and requirements. In this paper, the web-based learning environment Learn2Control is presented which has been developed to complement classical teaching methods in the control engineering education at Technische Universität Dortmund. Learn2Control provides the students with the opportunity to apply their knowledge of control theory and to gain experience in project-oriented workflows by means of authentic case studies on control systems design. The didactic concept aims at teaching the dependencies and the interactions between the solution of the subtasks of modeling, analysis, and controller design, and between the methods used for these tasks. In the latest version of Learn2Control, only a standard web browser and Java have to be installed on the client computers. The modeling, analysis and design tasks are processed on web pages which are generated by Java server technology on a web server. For mathematical operations, a custom multi-user Matlab web service was developed. Currently, three control design projects are available within the new framework. In addition, four projects are available in a previous version of the framework that requires a Matlab installation on the client side. 
Keywords: e-learning; control education; project-oriented learning; software architectures Eran Lasser, Training personnel in the Israeli Defence Forces computer community, Education and Computing, Volume 6, Issues 1–2, July 1990, Pages 81-89, ISSN 0167-9287, https://doi.org/10.1016/S0167-9287(05)80052-4. (https://www.sciencedirect.com/science/article/pii/S0167928705800524) Abstract: This paper describes the training course for computer professionals in the Israeli Defence Forces (IDF) during their in-service years. It briefly reviews the development of the IDF computer community from the late 1950s onwards, offering a general description of the tasks carried out by this organization. It then gives a description of the training schemes for the software professional, beginning with his passing the Programming course, through Design Introduction and System Design, and ending with System Analysis. It also reviews the strict selection methods that are used to ensure a higher level of course graduates, to provide professionals on a par with the best in the Israeli computer community. In addition, a survey is presented of other courses that provide specific professional training in the different types of hardware and software. The paper also reviews the steps taken to assimilate software engineering and CASE in the IDF computer community, including the selection of a CASE tool for computer training. Keywords: Computer-aided software engineering (CASE); Training; Software engineering; Design; System analysis; Programming Ulrich Bosler, David Squires, Training teachers to design educational software: An international collaboration based in the Federal Republic of Germany, Education and Computing, Volume 5, Issues 1–2, 1989, Pages 49-53, ISSN 0167-9287, https://doi.org/10.1016/S0167-9287(89)80010-X. 
(https://www.sciencedirect.com/science/article/pii/S016792878980010X) Abstract: This paper describes a training programme for Computer Assisted Learning (CAL) authors designed to establish an infrastructure for educational software development in the Federal Republic of Germany (FRG). The programme aims: (1) to establish a core of CAL authors distributed over several states (‘Länder’) of the FRG, and (2) to produce a set of exemplar CAL packages. The adaptation of existing software from England, Scandinavia and the United States of America for use in the FRG has been the basis for the programme. The international nature of this software has been reflected in the international collaboration within the training programme. The programme was instigated by IPN (Institute for Science Education) Kiel, which has been responsible for coordination and administration. The design of the programme and the tutoring have been done in collaboration with King's College, London. Keywords: Educational software; Software design model; Adaptation of software; National infrastructure; Educational publishers; Teacher training through experience John Sum, Gilbert H. Young, Wing-kay Kan, Imprecise Neural Computation in Real-Time Neural System Design, IFAC Proceedings Volumes, Volume 31, Issue 14, June 1998, Pages 123-128, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)44883-X. (https://www.sciencedirect.com/science/article/pii/S147466701744883X) Abstract: In order to solve real-time control problems, a good software design together with an appropriate control scheme and a system identification method is extremely important. To facilitate software design to cope with such a time-critical system, the concept of imprecise and approximate computation has been proposed and applied in real-time scheduling problems for more than a decade. Applying neural networks to solve real-time problems has always been a challenge for neural network practitioners.
In this paper, a principle for neural computation in real-time systems, called imprecise neural computation, will be presented. This principle extends the idea of imprecise computation in real-time systems by introducing concepts like mandatory neural structure and imprecise pruning. Using such concepts, it is possible to design and analyze a real-time neural system for different real-time applications. Keywords: Imprecise computation; Model complexity; Neural computation; Pruning; Real-time systems; Training Jan-Philipp Steghöfer, Håkan Burden, Hiva Alahyari, Dominik Haneberg, No silver brick: Opportunities and limitations of teaching Scrum with Lego workshops, Journal of Systems and Software, Volume 131, September 2017, Pages 230-247, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2017.06.019. (https://www.sciencedirect.com/science/article/pii/S0164121217301206) Abstract: Education in Software Engineering has to teach both technical content, such as databases and programming, and organisational skills, such as team work and project management. While the former can be evaluated from a product perspective, the latter are usually embedded in a Software Engineering process and need to be assessed and adapted throughout their implementation. The in-action property of processes puts a strain on teachers since we cannot be present throughout the students’ work. To address this challenge we have adopted workshops to teach Scrum by building a Lego city in short sprints to focus on the methodological content. In this way we can be present throughout the process and coach the students. We have applied the exercise in six different courses, across five different educational programmes and observed more than 450 participating students. In this paper, we report on our experiences with this approach, based on quantitative data from the students and qualitative data from both students and teachers.
We give recommendations for learning opportunities and best practices and discuss the limitations of these workshops in a classroom setting. We also report on how the students transferred their methodological knowledge to software development projects in an academic setting. Keywords: Scrum; Agile software engineering; Software engineering education M. Corrado, L. De Vito, H. Ramos, J. Saliga, Hardware and software platform for ADCWAN remote laboratory, Measurement, Volume 45, Issue 4, May 2012, Pages 795-807, ISSN 0263-2241, https://doi.org/10.1016/j.measurement.2011.12.003. (https://www.sciencedirect.com/science/article/pii/S0263224111004404) Abstract: In this paper an innovative hardware and software platform, called ADCWAN (Analog to Digital Converters on Wide Area Network), concerning the electronic measurement field is presented. In particular, ADCWAN is a pioneering networking cooperative environment for Analog to Digital Converter (ADC) testing. The hardware and software architectures of ADCWAN are described in detail and some ADC tests using the same test setup are presented. ADCWAN is distributed over a wide geographic area, providing theoretical and practical tools to characterize ADCs. In particular, ADCWAN is a new approach to promote the harmonization of existing standards for ADCs; it establishes a collaborative work environment supporting the scientific research community to improve the harmonization level, allowing the scientific training of young researchers and the dissemination and comparison of metrological information. In the paper the goals and benefits of ADCWAN toward the research community are summarized. Keywords: Analog to digital converters; Distance learning; Remote laboratory; Standard harmonization S. Angelov, P.
de Beer, Designing and applying an approach to software architecting in agile projects in education, Journal of Systems and Software, Volume 127, May 2017, Pages 78-90, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2017.01.029. (https://www.sciencedirect.com/science/article/pii/S0164121217300195) Abstract: Software architecting activities are not discussed in most agile software development methods. That is why the combination of software architecting and agile methods has been the focus of numerous publications. However, there is little literature on how to approach software architecting in agile projects in education. In this paper, we present our approach to the introduction of software architecting activities in an agile project course. The approach is based on literature sources and is tailored to fit our educational goals and context. The approach has been applied in two consecutive executions of the course. We observe an improved understanding of the value of architecting activities and an appreciation among students of the combination of architecting activities and agile development. We applied the approach predominantly in cases with an architecturally savvy Product Owner. Further research is required to understand how the approach performs in scenarios with architecturally unsavvy Product Owners and if it needs to be adapted for these scenarios. We also conclude that more research is needed on the challenges that architects face in agile projects in order to better prepare students for practice. Keywords: Software architecture; Agile; Scrum; Teaching; Software engineering education; Students; Project; Course M.J. Wells, Engineer to Software Engineer - A Practical Retraining Approach, IFAC Proceedings Volumes, Volume 16, Issue 6, 1983, Pages 263-265, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)64374-X.
(https://www.sciencedirect.com/science/article/pii/S147466701764374X) Abstract: The increasing availability of in-house mini-computer power encouraged a local manufacturing company to consider the use of such equipment to aid their engineering processes, retaining the link to the remote parent company's computer installation for its data processing requirements. The introduction of such equipment immediately provided a resource which, if managed carefully, could result in some very valuable software being produced for the company, but, if mismanaged, could produce many virtually useless, very labour-intensive, one-off programs which were only of benefit to the programmer. Since none of the engineers had been trained in software, it was decided to undertake a rigorous training programme in conjunction with Chelmer-Essex Institute. This paper describes the case study outlined above, placing emphasis on the training design process and the general strategies and models used to effect this training programme. Keywords: Computer software; teaching; software engineering; programming languages; training Merle P Martin, William L Fuerst, Effect of computer knowledge on user performance over time, Information and Software Technology, Volume 30, Issue 9, November 1988, Pages 561-566, ISSN 0950-5849, https://doi.org/10.1016/0950-5849(88)90135-8. (https://www.sciencedirect.com/science/article/pii/0950584988901358) Abstract: An experiment was conducted to assess the influence of computer semantic (conceptual) knowledge on user performance. The results indicate computer semantic knowledge differences influence user performance on computer models. However, the use of human factors software design can mitigate that influence, especially during the initial periods of computer model use, as supported by the results. This study has important implications for software design and for industrial and academic training programs.
Keywords: software development; software design; computer training Boriss Misnevs, Vacius Jusas, José Luis Fernández Alemán, Nadezhda Kafadarova, Remote Evaluation of Software Engineering Competences, Procedia Computer Science, Volume 104, 2017, Pages 20-26, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2017.01.047. (https://www.sciencedirect.com/science/article/pii/S1877050917300480) Abstract: The paper focuses on and examines the issues and problems related to remote evaluation of software engineering competences using a progressive competence representation model. The authors suggest an original approach to competence evaluation for a Master Program in Software Engineering as a combination of academic competences and professional competences from the European e-Competence Framework (e-CF). Examples of competence descriptions for 16 subjects from the proposed Joint Master Program in Software Engineering are developed. Several types of scoring rubrics for evaluating Software Engineering competences are reviewed and rubric templates are created. The developed models and templates can be used by universities and IT enterprises for evaluating training results as well as the competences of Software Engineering Master program graduates. Keywords: Software engineering; Competence evaluation; Scoring rubrics; e-CF Pablo Gómez-Abajo, Esther Guerra, Juan de Lara, A domain-specific language for model mutation and its application to the automated generation of exercises, Computer Languages, Systems & Structures, Volume 49, September 2017, Pages 152-173, ISSN 1477-8424, https://doi.org/10.1016/j.cl.2016.11.001. (https://www.sciencedirect.com/science/article/pii/S147784241630094X) Abstract: Model-Driven Engineering (MDE) is a software engineering paradigm that uses models as main assets in all development phases.
While many languages for model manipulation exist (e.g., for model transformation or code generation), there is a lack of frameworks to define and apply model mutations. A model mutant is a variation of an original model, created by the application of specific model mutation operations. Model mutation has many applications, for instance, in the areas of model transformation testing, model-based testing or education. In this paper, we present a domain-specific language called Wodel for the specification and generation of model mutants. Wodel is domain-independent, as it can be used to generate mutants of models conformant to arbitrary meta-models. Its development environment is extensible, permitting the incorporation of post-processors for different applications. In particular, we describe Wodel-Edu, a post-processing extension directed to the automated generation of exercises for particular domains and their automated correction. We show the application of Wodel-Edu to the generation of exercises for deterministic automata, and report on an evaluation of the quality of the generated exercises, obtaining overall good results. Keywords: Model-Driven Engineering; Domain-Specific Languages; Model mutation; Education; Automatic exercise generation and correction Hannu Saarinen, Juha Tiitinen, Liisa Aha, Ali Muhammad, Jouni Mattila, Mikko Siuko, Matti Vilenius, Jorma Järvenpää, Mike Irving, Carlo Damiani, Luigi Semeraro, Optimized hardware design for the divertor remote handling control system, Fusion Engineering and Design, Volume 84, Issues 7–11, June 2009, Pages 1666-1670, ISSN 0920-3796, https://doi.org/10.1016/j.fusengdes.2008.11.050. (https://www.sciencedirect.com/science/article/pii/S0920379608003839) Abstract: A key ITER maintenance activity is the exchange of the divertor cassettes. One of the major focuses of the EU Remote Handling (RH) programme has been the study and development of the remote handling equipment necessary for divertor exchange. 
The current major step in this programme involves the construction of a full-scale physical test facility, namely DTP2 (Divertor Test Platform 2), in which to demonstrate and refine the RH equipment designs for ITER using prototypes. The major objective of the DTP2 project is the proof-of-concept studies of various RH devices, but it is also important to define principles for standardizing control hardware and methods around the ITER maintenance equipment. This paper focuses on describing the control system hardware design optimization that is taking place at DTP2. Here there will be two RH movers, namely the Cassette Multifunctional Mover (CMM) and the Cassette Toroidal Mover (CTM), with assisting water hydraulic force feedback manipulators (WHMAN) located aboard each mover. The idea here is to use common Real Time Operating Systems (RTOS), measurement and control IO-cards etc. for all maintenance devices and to standardize sensors and control components as much as possible. In this paper, the new optimized DTP2 control system hardware design and some initial experimentation with the new DTP2 RH control system platform are presented. The proposed new approach is able to fulfil the functional requirements for both Mover and Manipulator control systems. Since the new control system hardware design has a reduced architecture, there are a number of benefits compared to the old approach. The simplified hardware solution enables the use of a single software development environment and a single communication protocol. This will result in easier maintainability of the software and hardware, less dependence on trained personnel, easier training of operators and hence reduced development costs for ITER RH.
Keywords: CMM; DTP2; Hardware architecture; Software architecture; Multi-core Jörg Schumann, Strategy and experience in the training of staff involved in the process of application software development, Education and Computing, Volume 6, Issues 1–2, July 1990, Pages 29-32, ISSN 0167-9287, https://doi.org/10.1016/S0167-9287(05)80043-3. (https://www.sciencedirect.com/science/article/pii/S0167928705800433) Abstract: More and more advanced end-users are participating in the process of application software development. This paper presents a constructive approach to the specification of new skills which are expected of end-users who develop their own application software. The principal innovation of the approach is the use of questions to define software engineering (SE) elements and their relation. Based on the SE elements, we shall also describe the interface between the end-users and the DP professionals. Keywords: Training strategy; Application software development; Software engineering; Quality assurance; Life cycle model; Requirement specification; Prototyping; User participation; Advanced end-user; Data processing professional; New skills; Teachware Ezequiel Scott, Guillermo Rodríguez, Álvaro Soria, Marcelo Campo, Towards better Scrum learning using learning styles, Journal of Systems and Software, Volume 111, January 2016, Pages 242-253, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2015.10.022. (https://www.sciencedirect.com/science/article/pii/S0164121215002265) Abstract: Considerable attention has been paid to teaching Scrum in software engineering education as an academic response to the software industry’s demands. In order to reinforce and strengthen the understanding of Scrum concepts, professors should personalize the learning process, catering for students’ individual learning characteristics. To address this issue, learning styles become effective to understand students’ different ways of learning.
In this context, the meshing hypothesis claims that when both teaching and learning styles are aligned, the students’ learning experience is enhanced. However, the literature fails to evidence support for the meshing hypothesis in the context of software engineering education. We aim to corroborate the meshing hypothesis by using teaching strategies matching the Felder–Silverman Learning Style Model in a Scrum course. Based on previous findings, we focused on the processing dimension of the model. To validate our approach, two experiments were conducted in an undergraduate software engineering course in the academic years 2013 and 2014. We provided students with a Scrum class by applying teaching strategies suited to students’ learning styles. Test results corroborate that students’ outcomes improved when receiving the strategy that matched their learning styles. Our data highlight opportunities for improving software engineering education by considering the students’ learning preferences. Keywords: Agile software development; Software engineering education; Learning styles Miroslava Raspopović, Svetlana Cvetanović, Dušan Stanojević, Mateja Opačić, Software architecture for integration of institutional and social learning environments, Science of Computer Programming, Volume 129, 1 November 2016, Pages 92-102, ISSN 0167-6423, https://doi.org/10.1016/j.scico.2016.07.001. (https://www.sciencedirect.com/science/article/pii/S0167642316300855) Abstract: As technology continues to evolve and develop, requirements for effective teaching and learning methodologies are likewise growing and changing. As a result of this trend, Learning Management Systems (LMSs) are usually adopted in certain institutions, sometimes in order to take advantage of technological developments. However, these LMSs are adapted to a certain degree and may restrain users with their set of tools and functionalities.
New requirements imply the need for integration with third-party systems and tools which are often used to increase the efficacy of learning. Properly implemented technology can serve as a tool to create new opportunities for learning systems. This work focuses on the design and technological requirements of the software architecture for integrating an institutional e-learning system, an educational management system, and a social learning environment. This work proposes a software architecture that supports functionalities promoting effective teaching and learning, while giving an overview of the diversity of technologies and tools used in the proposed architecture. Keywords: Software architecture; e-Learning; Service oriented architecture Dragan Ivanović, Dušan Surla, Miroslav Trajanović, Dragan Misić, Zora Konjović, Towards the Information System for Research Programmes of the Ministry of Education, Science and Technological Development of the Republic of Serbia, Procedia Computer Science, Volume 106, 2017, Pages 122-129, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2017.03.044. (https://www.sciencedirect.com/science/article/pii/S1877050917303125) Abstract: The work describes the phases of the development of the information system on scientific activities covered by the Ministry of Education, Science and Technological Development of the Republic of Serbia. The sequential phases of the project are: Recording data on scientific institutions, researchers, and research projects financed by the Ministry, Input and evaluation of the published results achieved in the research projects financed by the Ministry, Establishing electronic services for searching, presentation, and interoperability of data on scientific activity, and Generating various reports for the different needs related to scientific activities. The information requirements are listed and the system software architecture is described.
The development of the system is based on the recommendations of the organization euroCRIS. Keywords: National CRIS; Ministry of Education; Science and Technological Development; Republic of Serbia Ezequiel Scott, Guillermo Rodríguez, Álvaro Soria, Marcelo Campo, Are learning styles useful indicators to discover how students use Scrum for the first time?, Computers in Human Behavior, Volume 36, July 2014, Pages 56-64, ISSN 0747-5632, https://doi.org/10.1016/j.chb.2014.03.027. (https://www.sciencedirect.com/science/article/pii/S0747563214001496) Abstract: Teaching agile practices is at the cutting edge of Software Engineering education since agile methodologies are widely used in industry. An effective strategy to teach agile practices is the use of a capstone project, in which students develop requirements following an agile methodology. To improve students’ learning experience, professors have to keep track of and analyze the information generated by the students during the capstone project development. The problem here arises from the large amount of information generated in the learning process, which makes it difficult for professors to grasp each student’s learning profile. In particular, knowing the students’ skills and preferences is a key aspect of a learner-centered approach to education in order to personalize teaching. In this work, we aim to discover the relationships between students’ performance along a Scrum-based capstone project and their learning style according to the Felder–Silverman model, as a first step towards building the profiles. To address this issue, we mined association rules from the interaction of 33 Software Engineering students with Virtual Scrum, a tool that supports the development of the capstone project in the course. In the present work we describe promising results in experiments with a case study.
Keywords: Software Engineering; Agile software development; Software Engineering education; Learning styles Carl Landwehr, Jochen Ludewig, Robert Meersman, David Lorge Parnas, Peretz Shoval, Yair Wand, David Weiss, Elaine Weyuker, Software Systems Engineering programmes: a capability approach, Journal of Systems and Software, Volume 125, March 2017, Pages 354-364, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2016.12.016. (https://www.sciencedirect.com/science/article/pii/S0164121216302576) Abstract: This paper discusses third-level educational programmes that are intended to prepare their graduates for a career building systems in which software plays a major role. Such programmes are modelled on traditional Engineering programmes but have been tailored to applications that depend heavily on software. Rather than describe knowledge that should be taught, we describe capabilities that students should acquire in these programmes. The paper begins with some historical observations about the software development field. Keywords: Engineering; Education; Software education; Information systems; Software design; Software development; Software documentation F. Rodríguez, J.L. Guzmán, M. Castilla, J.A. Sánchez-Molina, M. Berenguel, A proposal for teaching SCADA systems using Virtual Industrial Plants in Engineering Education, IFAC-PapersOnLine, Volume 49, Issue 6, 2016, Pages 138-143, ISSN 2405-8963, https://doi.org/10.1016/j.ifacol.2016.07.167. (https://www.sciencedirect.com/science/article/pii/S240589631630372X) Abstract: The main objectives of SCADA (Supervisory Control And Data Acquisition) systems are the supervisory analysis of the system, control algorithm validation, and data acquisition. These systems are normally implemented according to the international standards UNE-EN ISO 9241, ISA-101 Human-Machine Interfaces, ISA S5, and, in the case treated in this paper, the Spanish Royal Decree 488/1997.
This paper presents a software architecture for the development of educational laboratories, through industrial virtual plants whose models and logic are implemented in Matlab® and used within LabVIEW® through an appropriate protocol. LabVIEW® from National Instruments, a special-purpose software for this kind of application, was used, since it allows us to provide a friendly interface and to perform communications, data acquisition and information management. In addition, to illustrate the use of the proposed architecture, different virtual industrial plants have been developed for students of different Bachelor and Master degrees in engineering at the University of Almeria. This paper shows the different virtual industrial plants that have been developed using SCADA systems to facilitate students’ learning of basic concepts and techniques for an Industrial Informatics course. Keywords: Virtual; remote laboratories; E-learning in Control Engineering; Supervisory control; Problem-based learning; Virtual reality Yuxin Liang, G.P. Liu, Design of Remote 3D Virtual Laboratory for Education On Control System Experimentation, IFAC Proceedings Volumes, Volume 46, Issue 17, 2013, Pages 327-332, ISSN 1474-6670, https://doi.org/10.3182/20130828-3-UK-2039.00071. (https://www.sciencedirect.com/science/article/pii/S1474667015341227) Abstract: This work develops a novel educational 3D remote laboratory which allows a large number of users to learn math modeling and put their own algorithms into practice by controlling plenty of virtual test rigs. The proposed 3D-NCSLab aims at giving users a very vivid virtual presentation of real devices and a much more flexible way to have their experiments done at any time, no matter where they are, as long as they can connect to the internet. Twelve different test rigs are set into four 3D virtual laboratories according to their categories.
Indoor roaming animation, a new progressive area of virtual reality technology, is used in 3D-NCSLab to provide an immersive feeling: users can roam the 3D virtual labs from a first-person perspective and can also use the mouse and keyboard to change the perspective in order to achieve multi-observation. Flash technology, 3D modeling and the software architecture of 3D-NCSLab are discussed in detail. An example experiment is given to illustrate the effectiveness and convenience of 3D-NCSLab. Interesting ideas for future work along this line of research are also outlined. Keywords: 3D remote laboratory; 3D modeling; web-based laboratory; virtual reality; control engineering education F. Rodríguez, M. Castilla, J.A. Sánchez, A. Pawlowski, Semi-virtual Plant for the Modelling, Control and Supervision of batch-processes. An example of a greenhouse irrigation system, IFAC-PapersOnLine, Volume 48, Issue 29, 2015, Pages 123-128, ISSN 2405-8963, https://doi.org/10.1016/j.ifacol.2015.11.224. (https://www.sciencedirect.com/science/article/pii/S2405896315024829) Abstract: In this work, a hardware-software architecture for control engineering education is proposed and a case study regarding a greenhouse irrigation virtual process has been implemented. This specific process is especially useful for subjects related to industrial automation, since it is characterized by mixed dynamics (continuous/discrete) and can be controlled by a Programmable Logic Controller (PLC). In such a case, there is no need to access the real process, which is not always available for practical exercises. Moreover, the developed scheme eliminates the maintenance costs related to the real process, a very important advantage for academic institutions. The developed architecture for the industrial process simulation uses Labview, OPC, MODBUS, SolidWorks and a Schneider Modicon M340 PLC for the control of virtual processes.
The developed example is applied to greenhouse fertirrigation control, which enables agricultural engineering students to design the irrigation control system. The control strategy is implemented on the PLC, which interacts with the virtual process by carrying out the necessary actions in order to control the appropriate variables of the process. Keywords: virtual plant; engineering education; greenhouse; irrigation control Abdulrahman Alarifi, Mohammad Zarour, Noura Alomar, Ziyad Alshaikh, Mansour Alsaleh, SECDEP: Software engineering curricula development and evaluation process using SWEBOK, Information and Software Technology, Volume 74, June 2016, Pages 114-126, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2016.01.013. (https://www.sciencedirect.com/science/article/pii/S095058491630012X) Abstract: Context: Software engineering (SE) has a multidisciplinary and dynamic nature that makes it challenging to design its educational material. The Guide to the Software Engineering Body of Knowledge (SWEBOK), which has evolved to become the ISO/IEC 19759 standard, has identified various knowledge areas to be part of any SE curricula. Although there are a number of studies that address the gap between SE curricula and the software industry, the literature lacks a defined process that can be leveraged for continuously improving SE curricula to fulfill software development market demands. Objective: In this paper, we propose a Software Engineering Curricula Development and Evaluation Process (SECDEP) that takes advantage of the SWEBOK guidelines to improve the quality of SE programs based on objective and subjective evidence. Method: Our process consists of multiple steps in which the local software market needs and the target SE program objectives and constraints are all taken into consideration. As a case study, we follow our process to investigate the core SE courses delivered as part of the SE curricula in a set of universities in our region.
Results: The conducted case study identifies the factors that might contribute to mitigating the skills shortages in the target software market. We demonstrate the effectiveness of our process by identifying the weaknesses of the studied SE curricula and presenting recommendations to align the studied curricula with the demands of the target software market, which assists SE educators in the design and evaluation of their SE curricula. Conclusion: Based on the obtained results, the studied SE curricula can be enhanced by incorporating latest SE technologies, covering most of the SWEBOK knowledge areas, adopting SE curricula standards, and increasing the level of industrial involvement in SE curricula. We believe that achieving these enhancements by SE educators will have a positive impact on the SE curricula in question. Keywords: Software engineering education; Undergraduate curricula; SWEBOK; SE2004; SE2014; SWECOM Chung-Yang Chen, P. Pete Chong, Software engineering education: A study on conducting collaborative senior project development, Journal of Systems and Software, Volume 84, Issue 3, March 2011, Pages 479-491, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2010.10.042. (https://www.sciencedirect.com/science/article/pii/S0164121210002931) Abstract: Project and teamwork training is recognized as an important aspect in software engineering (SE) education. Senior projects, which often feature industrial involvement, serve the function of a ‘capstone course’ in SE curricula, by offering comprehensive training in collaborative software development. Given the characteristics of student team projects and the social aspects of software development, instructional issues in such a course must include: how to encourage teamwork, how to formalize and streamline stakeholder participation, and how to monitor students’ work, as well as sustain their desired collaborative effort throughout the development. 
In this paper, we present an exploratory study which highlights a particular case and introduces the meetings-flow approach. In order to investigate how this approach could contribute to the project's results, we examined its quantitative benefits in relation to the development of the project. We also conducted focus group interviews to discuss the humanistic findings and educational effects pertaining to this approach. Keywords: Software engineering education; Senior project; Collaborative development; Meetings-flow Art Pyster, Rick Adcock, Mark Ardis, Rob Cloutier, Devanandham Henry, Linda Laird, Harold ‘Bud’ Lawson, Michael Pennotti, Kevin Sullivan, Jon Wade, Exploring the Relationship between Systems Engineering and Software Engineering, Procedia Computer Science, Volume 44, 2015, Pages 708-717, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2015.03.016. (https://www.sciencedirect.com/science/article/pii/S1877050915002525) Abstract: In an effort to explore the relationship between the disciplines of systems engineering and software engineering, professionals from academia, industry, and government gathered for a workshop to deliberate on the current state, to acknowledge areas of inter-dependence, to identify relevant challenges, and to propose recommendations for addressing those challenges with respect to four topical areas: 1) Development Approaches, 2) Technical, 3) People, and 4) Education. This paper presents the deliberations and recommendations that emerged from that workshop, and the proposed project to be launched. Keywords: systems engineering; software engineering; development approaches; people; education Fairouz Tchier, Latifa Ben Arfa Rabai, Ali Mili, Putting engineering into software engineering: Upholding software engineering principles in the classroom, Computers in Human Behavior, Volume 48, July 2015, Pages 245-254, ISSN 0747-5632, https://doi.org/10.1016/j.chb.2015.01.054.
(https://www.sciencedirect.com/science/article/pii/S074756321500076X) Abstract: Ever since it emerged in the late (nineteen) sixties, the discipline of software engineering has set itself apart from other engineering disciplines in a number of ways, including: the pervasiveness of its products; the complexity of its products and processes; the criticality of its applications; the difficulty of managing its processes and estimating its costs; the volatility of its workforce; the intractability of its process lifecycles; etc. A number of principles have emerged from recent software engineering research that have the potential to bring a measure of control to the practice of this discipline, but they have not made it into routine practice in industry. We argue that the classroom is a good place to start acquainting students with these principles, and to start getting them into the habit of adhering to them as a matter of routine practice. Keywords: Software engineering; Software engineering education; Software engineering principles; Classroom; Teaching practice

Yuka Kato, An Education Program on a Network System Construction Process, IFAC Proceedings Volumes, Volume 42, Issue 24, 2010, Pages 296-301, ISSN 1474-6670, https://doi.org/10.3182/20091021-3-JP-2009.00056. (https://www.sciencedirect.com/science/article/pii/S1474667015316359) Abstract: This paper proposes an education program using e-Learning systems, which makes it possible for working students to obtain network system construction skills efficiently. The features of the program are: (i) it defines a network system construction process based on software development processes; (ii) it adopts a project-based learning style and tries for efficient remote team-learning; (iii) as for the construction process, it uses an iteration process with two cycles.
In addition, the author conducted the proposed education program as a course in a graduate school and confirmed the effectiveness of the program. Keywords: computer networks; education; information technology; software engineering; systems design

L. Canipel, H. Laroye, O. Juy, S. Mounier, M.-H. Petit, S. Franc, G. Charpentier, Protocol for interprofessional cooperation regarding medical telemonitoring of diabetes patients on insulin therapy, European Research in Telemedicine / La Recherche Européenne en Télémédecine, Volume 1, Issue 1, March 2012, Pages 32-39, ISSN 2212-764X, https://doi.org/10.1016/j.eurtel.2012.02.004. (https://www.sciencedirect.com/science/article/pii/S2212764X12000052) Abstract: The number of doctors in metropolitan France is set to fall by around 20% by 2020, which will necessarily entail longer waiting times to see a specialist. Telemedicine ensures greater proximity between doctors and patients when the latter leave hospital. For this reason, we have created a cooperation protocol for bringing together a multidisciplinary team around an electronic personalised education programme (ePEP) for patient-monitoring using an electronic blood glucose diary. Given changes in the health system, the goal of cooperation under the terms of article 51 of the Hospital, patients, health, territories (HPST) law is to ensure access for patients to high-quality health care throughout the entire national territory. Skills regarding medical activities have been transferred to paramedical actors in numerous countries, in many cases beginning several years ago. The present diabetes treatment protocol involves patient monitoring using telemedicine (submitted on 2 December 2011). The protocol covers remote treatment of patients on insulin therapy by a multidisciplinary team. To this end, we have trained nurses specialised in the management of such patients, who carry an electronic blood glucose diary.
These nurses perform medical acts that lie outside their own area of expertise and are normally undertaken by doctors; these are known as specially authorised acts. We have taken into consideration the requirements of both the specially authorised acts and of the telemedicine decree. We decided to insert an electronic version of the acts in the ePEP programme. This software, designed for training purposes, serves as a link between the various members of the multidisciplinary team. Doctors and health auxiliaries have access to patient files at all times. The cooperation protocol is currently undergoing evaluation in the ePEP trial, a multicentre feasibility study that will help define how healthcare is to be reorganised using the personalised education programme by: defining the role of nurses; assessing the benefits perceived by both caregivers and patients; evaluating the impact of ePEP on disease course after 6 months and in the longer term with regard to HbA1c levels, hypoglycaemic episodes, etc. Reorganisation of diabetes care is now well underway. Our management methods are being redesigned and formalised within a protocol in order to improve efficacy without incurring any corresponding increase in costs. We are convinced of the positive role to be played by telemedicine and the involvement of a multidisciplinary team in improving the management of diabetic patients on insulin therapy thanks to more up-to-date organisation of healthcare. We must now discuss the redistribution of costs with the healthcare authorities in order to ensure that such reorganisation is viable!
Keywords: Co-operation protocol; Specially authorised acts; Nurse; Personalised education programme; Diabetes patient on insulin

Steffen Zschaler, Birgit Demuth, Lothar Schmitz, Salespoint: A Java framework for teaching object-oriented software development, Science of Computer Programming, Volume 79, 1 January 2014, Pages 189-203, ISSN 0167-6423, https://doi.org/10.1016/j.scico.2012.04.005. (https://www.sciencedirect.com/science/article/pii/S016764231200069X) Abstract: Teaching systematic object-oriented software development to undergraduate students is difficult: Students need to develop a lot of complex skills. These include technical skills in object-oriented software development, but also social skills—for example, how to collaborate with other developers as part of a team working towards a large and complex software system. To acquire these skills, students need hands-on development experiences—for example, through team-oriented project courses. Designing such project courses is a challenge in itself: They must be both sufficiently challenging and achievable within the limited time available. In our special situation (large numbers of students supervised by small numbers of staff) an important further requirement is scalability: Different projects should be easily comparable while allowing for different tasks for different teams to reduce the risk of plagiarism. The solution that in our experience satisfies all these requirements is to use an application framework for an everyday application domain—for example, the business domain. Since 1997, we have been using Salespoint, a Java-based framework for creating business applications, that has been jointly developed and maintained in Dresden and Munich.
In this paper, we briefly recollect the educational background and aims of the courses and present Salespoint (and its most recent revision, Salespoint2010) in some detail: central notions like catalogs and stocks, the functionality it offers to users (application control, data management, and much more), a technical overview of its architecture, an example application built with Salespoint, and some lessons learned so far. Keywords: Programming education; Object-oriented framework; Business applications; Large-class project courses

Stéphanie Coomans, Gilberto Santos Lacerda, PETESE, a Pedagogical Ergonomic Tool for Educational Software Evaluation, Procedia Manufacturing, Volume 3, 2015, Pages 5881-5888, ISSN 2351-9789, https://doi.org/10.1016/j.promfg.2015.07.895. (https://www.sciencedirect.com/science/article/pii/S2351978915008963) Abstract: Educational software products have proliferated on the market in recent years, and evaluating their quality is necessary. Many authors have created evaluation checklists, but few combine pedagogical and ergonomic aspects. Moreover, most of them aim to help teachers select adequate software for their teaching. The present study proposes a tool for evaluating educational mathematics software based on a discovery-learning approach (PETESE), and aims to highlight the important development criteria of the software's design process before launching it on the market. The criteria are gathered from the fields of education, mathematics and ergonomics, and analyzed through the anasynthesis methodology. PETESE is finally applied to a concrete case, the educational mathematics software GGBook, a digital book developed by the Abaco lab (University of Brasilia), based on the GeoGebra environment and integrating facilities between the graphics and the operations elements.
The results of this research show the importance of a specific frame of reference in the creation process of mathematics software, pointing to elements the software needs to address better before entering the market. Keywords: PETESE; Educational software; predictive evaluation; pedagogical usability; mathematical software; discovery e-learning

Carlos A. Jara, Francisco A. Candelas, Pablo Gil, Fernando Torres, Francisco Esquembre, Sebastián Dormido, EJS+EjsRL: An interactive tool for industrial robots simulation, Computer Vision and remote operation, Robotics and Autonomous Systems, Volume 59, Issue 6, June 2011, Pages 389-401, ISSN 0921-8890, https://doi.org/10.1016/j.robot.2011.02.002. (https://www.sciencedirect.com/science/article/pii/S0921889011000224) Abstract: This paper presents an interactive Java software platform which enables users to easily create advanced robotic applications together with Computer Vision processing. This novel tool is composed of two layers: (1) Easy Java Simulations (EJS), an open-source tool which provides support for creating applications with a full 2D/3D interactive graphical interface, and (2) EjsRL, a high-level Java library specifically designed for EJS which provides a complete functional framework for modeling and simulation of arbitrary serial-link manipulators, Computer Vision algorithms and remote operation. The combination of both components sets up a software architecture which contains a high number of functionalities in the same platform to develop complex simulations in Robotics and Computer Vision fields. In addition, the paper shows its successful application to virtual and remote laboratories, web-based resources that enhance the accessibility of experimental setups for education and research.
Keywords: Modeling; Robot simulation; Robotics education; Visualization tools

Ivano Gatto, Fabio Pittarello, Creating Web3D educational stories from crowdsourced annotations, Journal of Visual Languages & Computing, Volume 25, Issue 6, December 2014, Pages 808-817, ISSN 1045-926X, https://doi.org/10.1016/j.jvlc.2014.10.010. (https://www.sciencedirect.com/science/article/pii/S1045926X14001037) Abstract: 3D representation and storytelling are two powerful means for educating students while engaging them. This paper describes a novel software architecture that couples them for creating engaging linear narrations that can be shared on the web. The architecture takes advantage of a previous work focused on the semantic annotation of 3D worlds, which allows users to go beyond the simple navigation of 3D objects, permitting them to retrieve objects with different search tools. The novelty of our architecture is that authors don’t have to build stories from scratch, but can take advantage of the crowdsourced effort of all the users accessing the platform, who can contribute by providing assets or annotating objects. To the best of our knowledge, no existing workflow includes the collaborative annotation of 3D worlds and the possibility to create stories on top of it. Another feature of our design is the possibility for users to switch from and to any of the available activities during the same session. This integration offers the possibility to define a complex user experience, even starting from a simple linear narration. The visual interfaces of the system will be described in relation to a case study focused on cultural heritage. Keywords: Annotation; Education; Ontology; Storytelling; Tag; Web3D

Grzegorz Zysko, Radu Barza, Klaus Schilling, Lei Ma, Frauke Driewer, Remote experiments on kinematics and control of mobile robots, IFAC Proceedings Volumes, Volume 37, Issue 8, July 2004, Pages 681-686, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)32057-8.
(https://www.sciencedirect.com/science/article/pii/S1474667017320578) Abstract: This paper describes remote experiments with mobile robot hardware via Internet for engineering tele-education. It emphasizes the technical realisation aspects, but also addresses the educational aims and the experiments' structure. The software architecture, sensor aspects and the autonomous functions for self-test and protection will be addressed in detail. Modules for remote sensor data acquisition and tele-operations via Internet, as well as related educational units for self-studying, have been implemented and evaluated. Keywords: remote control; mobile robots; tele-operation; sensor systems; data acquisition; laboratory; education

Ian Sommerville, Teaching cloud computing: A software engineering perspective, Journal of Systems and Software, Volume 86, Issue 9, September 2013, Pages 2330-2332, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2013.01.050. (https://www.sciencedirect.com/science/article/pii/S0164121213000198) Abstract: This article discusses the teaching of cloud computing in a software engineering course. It suggests that all courses should have some material introducing students to cloud computing, that practical teaching should focus on Platform as a Service, and that there is scope for a graduate course in cloud software engineering covering map-reduce, schema-free databases, service-oriented computing, security and compliance, and design for resilience. Keywords: Software engineering; Cloud computing; Education

Christiane Gresse von Wangenheim, Rafael Savi, Adriano Ferreti Borgatto, DELIVER! – An educational game for teaching Earned Value Management in computing courses, Information and Software Technology, Volume 54, Issue 3, March 2012, Pages 286-298, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2011.10.005.
(https://www.sciencedirect.com/science/article/pii/S095058491100214X) Abstract: Context: To meet the growing need for education in Software Project Management, educational games have been introduced as a beneficial instructional strategy. However, there are no low-cost board games openly available to teach Earned Value Management (EVM) in computing programs. Objective: This paper presents an educational board game to reinforce and teach the application of EVM concepts in the context of undergraduate computing programs, complementing expository lessons on EVM basics. Method: The game has been developed based on project management fundamentals and teaching experience in this area. So far, it has been applied in two project management courses in undergraduate computing programs at the Federal University of Santa Catarina. We evaluated motivation, user experience and the game’s contribution to learning through case studies on Kirkpatrick’s level one, based on the perception of the students. Results: First results of the evaluation of the game indicate a perceived potential of the game to contribute to the learning of EVM concepts and their application. The results also point out a very positive effect of the game on social interaction, engagement, immersion, attention and relevance to the course objectives. Conclusion: We conclude that the game DELIVER! can contribute to the learning of EVM on the cognitive levels of remembering, understanding and application. The illustration of the application of EVM through the game can demonstrate its usefulness. The game has proven to be an engaging instructional strategy, keeping students on task and attentive. In this respect, the game offers a possibility to complement traditional instructional strategies for teaching EVM. In order to further generalize and to strengthen the validity of the results, it is important to obtain further evaluations.
Keywords: Project management; Serious game; Teaching; Education; Earned Value Management; Computing

Jason McC. Smith, The Pattern Instance Notation: A simple hierarchical visual notation for the dynamic visualization and comprehension of software patterns, Journal of Visual Languages & Computing, Volume 22, Issue 5, October 2011, Pages 355-374, ISSN 1045-926X, https://doi.org/10.1016/j.jvlc.2011.03.003. (https://www.sciencedirect.com/science/article/pii/S1045926X11000188) Abstract: Design patterns are a common tool for developers and architects to understand and reason about a software system. Visualization techniques for patterns tend to be either highly theoretical in nature or based on a structural view of a system's implementation. The Pattern Instance Notation is a simple notation technique for visualizing design patterns and other abstractions of software engineering. While based on a formal representation of design patterns, PIN is a tool for comprehension or reasoning which requires no formal training or study, and it is suitable for the programmer or designer without a theoretical background. PIN is hierarchical in nature and compactly encapsulates abstractions that may be spread widely across a system in a concise graphical format, while allowing for repeated unveiling of deeper layers of complexity and interaction on demand. It is designed to be used in either a dynamic visualization tool, or as a static representation for documentation and as a teaching aid. Keywords: Design patterns; Visualization; Education; Comprehension

John Ogness, Klaus Schilling, Hubert Roth, A System To Facilitate Telematic Implementation, IFAC Proceedings Volumes, Volume 34, Issue 9, July 2001, Pages 85-90, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)41686-7.
(https://www.sciencedirect.com/science/article/pii/S1474667017416867) Abstract: In the interest of promoting online robotics education, this paper presents a system to allow research institutions to easily make robots available on the internet. The system is based on Java and provides a simple, yet complete package to handle network communication, user management, serial devices, and identification activities. By providing such an extensive functional base, telematic projects can be quickly implemented and made available while leaving developers to focus on more important issues. Keywords: Software engineering; Telematics; Systems design; Laboratory education; Communication systems

Eric Ras, Jörg Rech, Using Wikis to support the Net Generation in improving knowledge acquisition in capstone projects, Journal of Systems and Software, Volume 82, Issue 4, April 2009, Pages 553-562, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2008.12.039. (https://www.sciencedirect.com/science/article/pii/S016412120800277X) Abstract: Students have to cope with new technologies, changing environments, and conflicting changes in capstone projects. They often lack practical experience, which might lead to failing to achieve a project’s learning goals. In addition, the Net Generation students put new requirements upon software engineering education because they are digitally literate, always connected to the Internet and their social networks. They react fast and multitask, prefer an experimental working approach, are communicative, and need personalized learning and working environments. Reusing experiences from other students provides a first step towards building up practical knowledge and implementing experiential learning in higher education. In order to further improve knowledge acquisition during experience reuse, we present an approach based on Web 2.0 technologies that generates so-called learning spaces.
This approach automatically enriches experiences with additional learning content and contextual information. To evaluate our approach, we conducted a controlled experiment, which showed a statistically significant improvement in knowledge acquisition of 204% compared to conventional experience descriptions. From a technical perspective, the approach provides a good basis for future applications that support learning at the workplace in academia and industry for the Net Generation. Keywords: Net Generation; Software engineering education; Capstone project; Web 2.0; Learning space; Knowledge acquisition

Barış Yüce, Adem Karahoca, Dilek Karahoca, The use of electronic curriculums in occupational education to evaluate and improve the cognitive capacity of candidate software engineers, Procedia Computer Science, Volume 3, 2011, Pages 1418-1424, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2011.01.024. (https://www.sciencedirect.com/science/article/pii/S1877050911000251) Abstract: This research aims to go beyond traditional design techniques and to apply an educational system. This system was designed to allow students to apply the theoretical knowledge and jargon of the System Analysis and Design course at Bahçeşehir University, Engineering Faculty, usefully to other branches and in different situations. This course was taken by students in the fall term of 2008. The learning system which has been designed involves electronic applications and tests to make the content more understandable and to improve students’ knowledge. At the end of the term, it was measured to what extent students were able to use the skills they had learned from the course. Weekly repeated case studies and other tools were employed to exercise students’ short-term memories. Their knowledge was gained through a combination of face-to-face sessions (information from the formal teaching methods and case studies) and online tools (jargon-based word games and tests).
Keywords: Blended learning; E-Learning; Critical Thinking; System Analysis; Learning Strategies

Tim Dornan, Catherine Lee, Adam Stopford, Liam Hosie, Neil Maredia, Alan Rector, Rapid application design of an electronic clinical skills portfolio for undergraduate medical students, Computer Methods and Programs in Biomedicine, Volume 78, Issue 1, April 2005, Pages 25-33, ISSN 0169-2607, https://doi.org/10.1016/j.cmpb.2004.12.001. (https://www.sciencedirect.com/science/article/pii/S0169260704002287) Abstract: The aim was to find how to use information and communication technology to present the clinical skills content of an undergraduate medical curriculum. Rapid application design was used to develop the product, and technical action research was used to evaluate the development process. A clinician–educator, two medical students, two computing science masters students, two other project workers, and a hospital education informatics lead formed a design team. A sample of stakeholders took part in requirements planning workshops and continued to advise the team throughout the project. A university hospital had many features that favoured fast, inexpensive, and successful system development: a clearly defined and readily accessible user group; location of the development process close to end-users; fast, informal communication; leadership by highly motivated and senior end-users; devolved authority and lack of any rigidly imposed management structure; cooperation of clinicians because the project drew on their clinical expertise to achieve scholastic goals; a culture of learning and involvement of highly motivated students. A detailed specification was developed through storyboarding, use case diagramming, and evolutionary prototyping. A very usable working product was developed within weeks.
“SkillsBase” is a database web application using Microsoft® Active Server Pages, served from a Microsoft® Windows 2000 Server operating system running Internet Information Server 5.0. Graphing functionality is provided by the KavaChart applet. It presents the skills curriculum, provides a password-protected portfolio function, and offers training materials. The curriculum can be presented in several different ways to help students reflect on their objectives and progress towards achieving them. The reflective portfolio function is entirely private to each student user and allows them to document their progress in attaining skills, as judged by self, peer and tutor assessment, and examinations. Training materials include web links and materials developed locally using pedagogic principles developed by the SkillsBase team. Although the usability of SkillsBase has been proven, uptake of software that has arisen ‘bottom-up’ from within the curriculum has proved slow. We plan to incorporate the SkillsBase services into a more comprehensive virtual managed learning environment, anticipating that presenting the functionality in an environment that is routinely used by students and teachers will increase uptake and use. Keywords: Computer-aided instruction; Clinical skills; Medical education; Software design; Rapid application development

David Budgen, James E. Tomayko, The SEI curriculum modules and their influence: Norm Gibbs' legacy to software engineering education, Journal of Systems and Software, Volume 75, Issues 1–2, 15 February 2005, Pages 55-62, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2004.02.027. (https://www.sciencedirect.com/science/article/pii/S016412120400041X) Abstract: The Software Engineering Institute (SEI) at Carnegie Mellon University started its first contract with a carte blanche opportunity and generous funding to improve the state of software engineering education.
Norm Gibbs, the first Director of Education at the SEI, guided efforts in this area. One of his innovations, discussed here, was the “curriculum modules” encapsulating software engineering knowledge. We describe the scope and form of the curriculum modules, together with our personal experiences of developing the prototype modules. We conclude with an informal assessment of how well the original set of SEI curriculum modules matches current ideas, both about software engineering education and also about the activities and practices that make up software engineering as a discipline. Keywords: Software engineering education; Curriculum modules; Software Engineering Institute; SEI

Edgar E. Hassler, David P. Hale, Joanne E. Hale, A comparison of automated training-by-example selection algorithms for Evidence Based Software Engineering, Information and Software Technology, Volume 98, June 2018, Pages 59-73, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2018.02.001. (https://www.sciencedirect.com/science/article/pii/S0950584917302896) Abstract: Context: Study search and selection is central to conducting Evidence Based Software Engineering (EBSE) research, including Systematic Literature Reviews and Systematic Mapping Studies. Thus, selecting relevant studies and excluding irrelevant studies is critical. Prior research argues that study selection is subject to researcher bias, and the time required to review and select relevant articles is a target for optimization. Objective: This research proposes two training-by-example classifiers that are computationally simple, do not require extensive training or tuning, ensure inclusion/exclusion consistency, and reduce researcher study selection time: one based on Vector Space Models (VSM), and a second based on Latent Semantic Analysis (LSA).
Method: Algorithm evaluation is accomplished through Monte-Carlo Cross-Validation simulations, in which study subsets are randomly chosen from the corpus for training, with the remainder classified by the algorithm. The classification results are then assessed for recall (a measure of completeness), precision (a measure of exactness) and researcher efficiency savings (the reduced proportion of corpus studies requiring manual review as a result of algorithm use). A second, smaller simulation is conducted for external validation. Results and conclusions: VSM algorithms perform better in recall; LSA algorithms perform better in precision. Recall improves with larger training sets with a higher proportion of truly relevant studies. Precision improves with training sets with a higher proportion of irrelevant studies, without a significant impact from the training set size. The algorithms reduce the influence of researcher bias and are found to significantly improve researcher efficiency. To improve recall, the findings recommend VSM and a large training set including as many truly relevant studies as possible. If precision and efficiency are most critical, the findings suggest LSA and a training set including a large proportion of truly irrelevant studies. Keywords: Research infrastructure; Evidence Based Software Engineering; Systematic Literature Review; Systematic Mapping Studies; Culling; VSM; LSA; Recall; Precision; Document selection

Susan Ferreira, Misagh Faezipour, Advancing the Development of Systems Engineers Using Process Simulators, Procedia Computer Science, Volume 8, 2012, Pages 81-86, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2012.01.017. (https://www.sciencedirect.com/science/article/pii/S187705091200018X) Abstract: Simulators can serve as an efficient and effective means to educate and train systems engineering students, as well as current systems engineers, to understand complex aspects related to systems engineering.
A systems engineering simulator can promote and accelerate experiential learning. Systems engineering simulators allow users to consider different project options and perform “what if” analyses in order to evaluate the dynamic consequences of decisions and understand the various related system dynamics without impact to the real system being simulated. Systems engineering “management flight” simulators can help individuals to develop the crucial systems thinking skills necessary for systems engineers and can reduce the time to develop an understanding of complex system dynamics. This paper discusses research related to the educational use of simulators in software engineering. These findings can be leveraged for the development of systems engineering process simulators. Keywords: Systems Engineering Simulator; System Dynamics; Systems Thinking; Systems Engineering Education

Nancy R. Mead, Software engineering education: How far we’ve come and how far we have to go, Journal of Systems and Software, Volume 82, Issue 4, April 2009, Pages 571-575, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2008.12.038. (https://www.sciencedirect.com/science/article/pii/S0164121208002756) Abstract: In this paper I trace the history of software engineering education and focus on some of the key players. I highlight what has been accomplished in degree programs and curricula, conferences and working groups, professionalism, certification, and industry–university collaboration. I also look at the challenges that lie ahead—the global reach of education, new delivery mechanisms, new professional efforts, and the need to engage in leadership in software engineering education. What new approaches should be considered? How can we educators maintain our vitality? How can we best nurture new educators and encourage others to join our profession?
Keywords: Software engineering education; Software engineering history

Ricardo Valerdi, Ray Madachy, Impact and contributions of MBASE on software engineering graduate courses, Journal of Systems and Software, Volume 80, Issue 8, August 2007, Pages 1185-1190, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2006.09.051. (https://www.sciencedirect.com/science/article/pii/S0164121206002895) Abstract: As the founding Director of the Center for Software Engineering, Professor Barry Boehm developed courses that have greatly impacted the education of software engineering students. Through the use of the MBASE framework and complementary tools, students have been able to obtain real-life software development experience without leaving campus. Project team clients and the universities have also benefited. This paper provides evidence on the impact of Dr. Boehm’s frameworks on courses at two universities, and identifies major contributions to software engineering education and practice. Keywords: Barry Boehm; MBASE; COCOMO; Software engineering graduate courses; Software engineering education

Ana M. Moreno, Maria-Isabel Sanchez-Segura, Fuensanta Medina-Dominguez, Laura Carvajal, Balancing software engineering education and industrial needs, Journal of Systems and Software, Volume 85, Issue 7, July 2012, Pages 1607-1620, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2012.01.060. (https://www.sciencedirect.com/science/article/pii/S0164121212000398) Abstract: In the world of information and communications technologies, the demand for professionals with software engineering skills grows at an exponential rate. On this ground, we have conducted a study to help both academia and the software industry form a picture of the relationship between the competences of recent graduates of undergraduate and graduate software engineering programmes and the tasks that these professionals are to perform as part of their jobs in industry.
Thanks to this study, academia will be able to observe which skills demanded by industry the software engineering curricula do or do not cater for, and industry will be able to ascertain which tasks a recent software engineering programme graduate is well qualified to perform. The study focuses on the software engineering knowledge guidelines provided in SE2004 and GSwE2009, and the job profiles identified by Career Space. Keywords: Software engineering education; Software engineering curricula; Software industry profiles; Software engineer competences; Software engineer skills

Peter J. Bradley, Juan A. de la Puente, Juan Zamorano, Daniel Brosnan, A Platform for Real-Time Control Education with LEGO MINDSTORMS®, IFAC Proceedings Volumes, Volume 45, Issue 11, 2012, Pages 112-117, ISSN 1474-6670, https://doi.org/10.3182/20120619-3-RU-2024.00062. (https://www.sciencedirect.com/science/article/pii/S1474667015375868) Abstract: A set of software development tools for building real-time control systems on a simple robotics platform is described in the paper. The tools are being used in a real-time systems course as a basis for student projects. The development platform is a low-cost PC running GNU/Linux, and the target system is LEGO MINDSTORMS NXT, thus keeping the cost of the laboratory low. Real-time control software is developed using a mixed paradigm. Functional code for control algorithms is automatically generated in C from Simulink models. This code is then integrated into a concurrent, real-time software architecture based on a set of components written in Ada. This approach enables the students to take advantage of the high-level, model-oriented features that Simulink offers for designing control algorithms, and the comprehensive support for concurrency and real-time constructs provided by Ada. Keywords: Control education; real-time systems; embedded systems; LEGO MINDSTORMS; Simulink; robot programming; Ada tasking programs

Timothy C.
Lethbridge, Priorities for the education and training of software engineers, Journal of Systems and Software, Volume 53, Issue 1, 15 July 2000, Pages 53-71, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(00)00009-1. (https://www.sciencedirect.com/science/article/pii/S0164121200000091) Abstract: We present the complete results of our 1998 survey of software practitioners. In this survey we asked over 200 software developers and managers from around the world what they thought about 75 educational topics. For each topic, we asked them how much they had learned about it in their formal education, how much they know about it now and how important the topic has been in their career. The objective of the survey was to provide data that can be used to improve the education and training of information technology workers. The results suggest that some widely taught topics perhaps should be taught less, while coverage of other topics should be increased. Keywords: Software engineering education; Computing education; Software engineering body of knowledge

Ig Ibert Bittencourt, Evandro Costa, Marlos Silva, Elvys Soares, A computational model for developing semantic web-based educational systems, Knowledge-Based Systems, Volume 22, Issue 4, May 2009, Pages 302-315, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2009.02.012. (https://www.sciencedirect.com/science/article/pii/S0950705109000501) Abstract: Recently, some initiatives to start the so-called semantic web-based educational systems (SWBES) have emerged in the field of artificial intelligence in education (AIED). The main idea is to incorporate semantic web resources into the design of AIED systems, aiming to update their architectures to provide more adaptability, robustness and richer learning environments. However, the construction of such systems is highly complex and faces several challenges in terms of software engineering and artificial intelligence aspects.
This paper presents a computational model for developing SWBES focusing on the problem of how to make the development easier and more useful for both developers and authors. In order to illustrate the features of the proposed model, a case study is presented. Furthermore, a discussion about the results regarding the construction of the computational model is provided. Keywords: Semantic web based educational systems; Agent-based software engineering; Artificial intelligence in education

David Samper, Jorge Santolaria, Ana Cristina Majarena, Juan José Aguilar, Comprehensive simulation software for teaching camera calibration by a constructivist methodology, Measurement, Volume 43, Issue 5, June 2010, Pages 618-630, ISSN 0263-2241, https://doi.org/10.1016/j.measurement.2010.01.009. (https://www.sciencedirect.com/science/article/pii/S0263224110000151) Abstract: This paper describes the Metrovisionlab computer application implemented as a toolbox for the Matlab program. It is designed to teach the most important camera calibration aspects in dimensional metrology applications such as laser triangulation sensors and photogrammetry or stereovision systems. This software is used in several industrial vision courses for senior undergraduate mechanical engineering students. The application: simulates a virtual camera, providing a simple and visual understanding of how various characteristics of a camera influence the image that it captures; generates the coordinates of synthetic calibration points, both in the world reference system and the image reference system; and can calibrate with the most important and widely-used methods in the area of vision cameras, using coplanar or non-coplanar calibration points. Thus, the main goal is to have a simulation tool that allows characterizing the accuracy, repeatability, error mechanisms and influences for different measurement conditions and camera calibration algorithms.
In the tests carried out, the software proved to be a very effective educational tool. Keywords: Camera calibration; Software engineering education; Vision camera simulation

Lars Bo Eriksen, Jan Stage, A qualitative empirical study of case tool support to method learning, Information and Software Technology, Volume 40, Issues 5–6, 30 July 1998, Pages 339-345, ISSN 0950-5849, https://doi.org/10.1016/S0950-5849(98)00062-7. (https://www.sciencedirect.com/science/article/pii/S0950584998000627) Abstract: The appropriate time for introducing and using CASE tools in software engineering education and training is a matter of debate. Some argue in favor of early introduction with the aim of enforcing standardization and enhancing method learning. Others argue that CASE tool introduction should be postponed until a reasonable level of competence has been reached. This article discusses the relevance of CASE tools in the process of learning a software development method. The discussion is based on empirical results from a study that was conducted in a university setting. Keywords: Software engineering education; Method learning; CASE tool; Empirical study

J.-Michael Bauschat, Learning about Flight Control Using Airborne Simulation, IFAC Proceedings Volumes, Volume 39, Issue 6, 2006, Pages 389-396, ISSN 1474-6670, https://doi.org/10.3182/20060621-3-ES-2905.00068. (https://www.sciencedirect.com/science/article/pii/S1474667015331621) Abstract: The article deals with an educational concept in the field of experimental flight systems and applied flight control. Within the scope of a university teaching position at the Technical University of Berlin, the author provides students with theoretical information and practical experiences in flight control and flight-testing. The students come from different branches of aeronautical education and have very different previous knowledge.
Within two semesters the students learn in the lectures about particular aspects of theoretical and applied flight control, aircraft dynamics, man/machine systems, real-time simulation, flight-testing, software engineering concepts, etc. In laboratory courses the students work with a modern computer-based simulation and control system design tool. They learn how to analyze dynamic systems and how they can be influenced by feedforward and feedback controllers. In the last part of the second semester the designed controllers are implemented in the fly-by-wire system of DLR’s flying test-bed ATTAS (Advanced Technologies Testing Aircraft System). The flight-tests are performed by DLR test pilots. After the two semesters the students have to write a report including all theoretical results and flight-test data evaluations, and they have to pass an oral examination. Keywords: Flight control; Education; Modelling; Model-following control; Aircraft; Computer-aided control system design; Software engineering

Nenad Stankovic, Single development project, Journal of Systems and Software, Volume 82, Issue 4, April 2009, Pages 576-582, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2008.12.046. (https://www.sciencedirect.com/science/article/pii/S0164121208002811) Abstract: An often-cited problem in undergraduate software engineering courses states that some topics are difficult to teach in a university setting and, although laboratory work is a useful supplement to the lectures, it is difficult to make projects realistic and relevant. In recognition of this problem, and based on our past experience, we started preparing a new course by examining the pedagogies and curricular aspects of software engineering that are important for the Net Generation of software engineers.
The course project described in this paper concentrates on those aspects that can be dealt with effectively within the environment, i.e., the software lifecycle, system interdependences, teamwork, and realistic yet manageable project dynamics, all supported by various means of communication. The workload per student must be balanced with their lack of knowledge and skills, so that their unpreparedness to deal with complex issues does not abate their motivation. The approach was tested on six large projects over the period of one semester. We believe that the results reflect the students’ strong interest and commitment, and demonstrate their ability to stay focused and work at a level that is well above the obvious. Keywords: Education; Software engineering; Project

Dániel Varga, Kristóf Csorba, Dávid Szalóki, Zoltán Beck, Gábor Tevesz, Educational aspects of designing robot for Eurobot contest, IFAC Proceedings Volumes, Volume 45, Issue 11, 2012, Pages 342-347, ISSN 1474-6670, https://doi.org/10.3182/20120619-3-RU-2024.00074. (https://www.sciencedirect.com/science/article/pii/S1474667015376278) Abstract: The Eurobot contest is an international robot building competition. This paper presents the mechanical and electric design of an autonomous robot built for the Eurobot 2011 robotics contest. The robot collects the playing elements, builds towers and avoids collisions with the opponent robot. The building process, the control and vision algorithms are also presented. We specify the absolute (ultrasonic sonar) and relative (odometry) positioning methods used, and we describe their advantages and disadvantages. The hardware and software architecture and navigation algorithms are also presented. This paper focuses on the educational aspects of the project. We demonstrate how our team collaborated, and how the tasks were assigned to the members in order to complete their theses.
Keywords: Autonomous mobile robots; Eurobot; Control education; Mechanical engineering; Positioning systems; Obstacle avoidance; Robot vision

Cuauhtémoc López-Martín, Cornelio Yáñez-Márquez, Agustín Gutiérrez-Tornés, Predictive accuracy comparison of fuzzy models for software development effort of small programs, Journal of Systems and Software, Volume 81, Issue 6, June 2008, Pages 949-960, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2007.08.027. (https://www.sciencedirect.com/science/article/pii/S016412120700218X) Abstract: Regression analysis to generate predictive equations for software development effort estimation has recently been complemented by analyses using less common methods such as fuzzy logic models. On the other hand, unless engineers have the capabilities provided by personal training, they cannot properly support their teams or consistently and reliably produce quality products. In this paper, an investigation aimed at comparing personal Fuzzy Logic Models (FLM) with a Linear Regression Model (LRM) is presented. The evaluation criteria were based mainly upon the magnitude of error relative to the estimate (MER) as well as the mean MER (MMER). One hundred five small programs were developed by thirty programmers. From these programs, three FLM were generated to estimate the effort in the development of twenty programs by seven programmers. Both the verification and validation of the models were carried out. Results show a slightly better predictive accuracy amongst FLM and LRM for estimating the development effort at the personal level when small programs are developed. Keywords: Software engineering education; Software effort estimation; Fuzzy logic; Linear regression; Personal software process

Christopher Peterson, Zenon Chaczko, Craig Scott, David Davis, Software Project Management for Developing Countries, IFAC Proceedings Volumes, Volume 38, Issue 1, 2005, Pages 35-40, ISSN 1474-6670, https://doi.org/10.3182/20050703-6-CZ-1902.02269.
(https://www.sciencedirect.com/science/article/pii/S1474667016382817) Abstract: Software is developed and implemented by enterprises that wish to increase their efficiency and effectiveness. This process is often undertaken by persons who have little or no formal training in the field, particularly in developing countries. The results are frequently disadvantageous and often fatal to the enterprise. The University of Technology, Sydney has designed a special short postgraduate program targeted at persons in developing countries who have or wish to have such software responsibility. The response to this program has proven to be significant as it provides a fast and effective approach to increasing software project management capability. Keywords: software engineering; computers; education; information technology; software project management

Jean-Guy Schneider, Lorraine Johnston, eXtreme Programming––helpful or harmful in educating undergraduates?, Journal of Systems and Software, Volume 74, Issue 2, 15 January 2005, Pages 121-132, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2003.09.025. (https://www.sciencedirect.com/science/article/pii/S0164121203002929) Abstract: Criticism is sometimes leveled at the academic Software Engineering community on the basis that current educational practices are too document-centric. Both students and practitioners have suggested that one of the popular, lighter-weight, agile methods would be a better choice. This paper examines the educational goals for undergraduate Software Engineering education and considers how they might be met by the practices of eXtreme Programming. Our judgment is that education about some agile practices could be beneficial for small-scale development. However, as it stands now, eXtreme Programming as a package does not lend itself to use in educating about large-scale system development in tertiary education.
Keywords: eXtreme programming; Agile methodologies; Tertiary education; Software engineering education

Anne Morgan Spalter, Andries van Dam, Problems with using components in educational software, Computers & Graphics, Volume 27, Issue 3, June 2003, Pages 329-337, ISSN 0097-8493, https://doi.org/10.1016/S0097-8493(03)00027-X. (https://www.sciencedirect.com/science/article/pii/S009784930300027X) Abstract: Reuse is vital in the education world because the time and money necessary to create high-quality educational software is a significant problem. Estimates for the cost of creating a single well-designed, highly graphical and interactive online course in the commercial domain range from several hundred thousand dollars to a million or more. Thus, the idea of reusable software components that can be shared easily is tremendously appealing. In fact, “component” has become a buzzword in the educational software community, with millions of dollars from the National Science Foundation and other sponsors funding a wide variety of “component-based” projects. But few, if any, of these projects have approached the grand vision of creating repositories of easy-to-reuse components for developers and educators. This paper investigates some of the factors that stand in the way of achieving this goal. It also looks forward to a new genre of educational software that we hope will emerge when the basic components problems have been addressed. We begin by defining the word “component” and looking at several projects using components, with a focus on our Exploratories project at Brown University. We then discuss challenges in: critical mass, intellectual property issues, platform and system specificity, programming in the university environment, quality assurance, searching and metadata, and social issues. We look at relevant software engineering issues and describe why we believe educational applications have unique factors that should be considered when using components.
Keywords: Components; Education; Educational software; Reuse

Dieter Rombach, Jürgen Münch, Alexis Ocampo, Watts S. Humphrey, Dan Burton, Teaching disciplined software development, Journal of Systems and Software, Volume 81, Issue 5, May 2008, Pages 747-763, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2007.06.004. (https://www.sciencedirect.com/science/article/pii/S0164121207001434) Abstract: Discipline is an essential prerequisite for the development of large and complex software-intensive systems. However, discipline is also important on the level of individual development activities. A major challenge for teaching disciplined software development is to enable students to experience the benefits of discipline and to overcome the gap between real professional scenarios and scenarios used in software engineering university courses. Students often do not have the chance to internalize what disciplined software development means at both the individual and collaborative level. Therefore, students often feel overwhelmed by the complexity of disciplined development and later on tend to avoid applying the underlying principles. The Personal Software Process (PSP) and the Team Software Process (TSP) are tools designed to help software engineers control, manage, and improve the way they work at both the individual and collaborative level. Both tools have been considered effective means for introducing discipline into the conscience of professional developers. In this paper, we address the meaning of disciplined software development, its benefits, and the challenges of teaching it. We present a quantitative study that demonstrates the benefits of disciplined software development on the individual level and provides further experience and recommendations with PSP and TSP as teaching tools.
Keywords: Software development; Productivity; Defect density; Size estimation; Effort estimation; Yield; Personal software process; Team software process; Experimental software engineering; Software engineering education

David S. Janzen, Jungwoo Ryoo, Engaging the net generation with evidence-based software engineering through a community-driven web database, Journal of Systems and Software, Volume 82, Issue 4, April 2009, Pages 563-570, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2008.12.047. (https://www.sciencedirect.com/science/article/pii/S016412120800280X) Abstract: Software engineering faculty face the challenge of educating future researchers and industry practitioners regarding the generation of empirical software engineering studies and their use in evidence-based software engineering. In order to engage the Net generation with this topic, we propose development and population of a community-driven Web database containing summaries of empirical software engineering studies. We also present our experience with integrating these activities into a graduate software engineering course. These efforts resulted in the creation of “SEEDS: Software Engineering Evidence Database System”. Graduate students initially populated SEEDS with 216 summaries of empirical software engineering studies. The summaries were randomly sampled and reviewed by industry professionals, who found the student-written summaries to be at least as useful as professional-written summaries. In fact, 30% more of the respondents found the student-written summaries to be “very useful”. Motivations, student and instructor-developed prototypes, and assessments of the resulting artifacts will be discussed. Keywords: Evidence-based software engineering; Empirical software engineering; Software engineering education

E. Ersü, G.
Kegel, Optical Proximity Sensor Systems for Intelligent Robot Hands, IFAC Proceedings Volumes, Volume 20, Issue 4, July 1987, Pages 191-196, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)55852-8. (https://www.sciencedirect.com/science/article/pii/S1474667017558528) Abstract: Optical proximity sensors are low-level vision sensors delivering low- and mid-range multidimensional information about the robot’s end-effector environment. Due to the fast signal processing they can be easily integrated in real-time robot control tasks. The paper presents the basic mechanical, hardware and software design principles of such a sensor, which uses distance measurement via optical triangulation as the basic method. For special robot tasks, special mechanical and hardware arrangements of the basic sensor type are needed. Two examples are shown for demonstration purposes. Possible applications are simple distance sensor devices, two-dimensional orientation sensors and optical robot teach-in units. Accuracy and efficiency of the sensor system are documented by using the sensor for recognizing holes and following arbitrary unknown contours. Keywords: Position measurement; computer peripheral equipment; optical distance sensors; robot teach-in

Robert Wesley McGrew, Rayford B. Vaughn, Discovering vulnerabilities in control system human–machine interface software, Journal of Systems and Software, Volume 82, Issue 4, April 2009, Pages 583-589, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2008.12.049. (https://www.sciencedirect.com/science/article/pii/S0164121209000077) Abstract: As educators plan for curriculum enhancement and modifications to address the net-generation of software engineers, it will be important to communicate the necessity of considering software security engineering as applications are net-enabled.
This paper presents a case study in which commonly accepted software security engineering principles, published and employed for approximately 30 years, are not often seen in an important class of application software today. That class of software is commonly referred to as control system software or supervisory control and data acquisition (SCADA) software, which is being used today within critical infrastructures and being net-enabled as it is modernized. This circumstance is driven by evolution and not intention. This paper details several vulnerabilities existing in a specific software application as a case study. These vulnerabilities are a result of not following widely-accepted secure software engineering practices which should have been considered by the software engineers developing the product studied. The applicability of these lessons to the classroom is also established with examples of how they are integrated into software engineering and computer science curricula. Keywords: SCADA; HMI; Security; Authentication; Education

Nien-Lin Hsueh, Wen-Hsiang Shen, Zhi-Wei Yang, Don-Lin Yang, Applying UML and software simulation for process definition, verification, and validation, Information and Software Technology, Volume 50, Issues 9–10, August 2008, Pages 897-911, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2007.10.015. (https://www.sciencedirect.com/science/article/pii/S0950584907001231) Abstract: Process definition, verification, and validation are recognized as critical elements in software process improvement, whereas CMMI is a process improvement approach that provides organizations with the essential elements of effective processes. Organizations must define their own processes to meet the requirements of CMMI. A friendly, unambiguous process modeling language and tool are thus very important for organizations to define, verify, and validate the processes.
Nevertheless, hardly any research has yet been done on how to embed CMMI process area goals into the process definition stage to support organizational process improvement. In this research, we propose a UML-based approach to define, verify, and validate an organization’s process. Our approach can also be applied to a process learning environment for students and project members. Keywords: CMMI; UML; Process modeling; Process simulation; Software engineering education

R.T. Hughes, Using practitioners' problems to shape the content of a course on software project management, Annual Review in Automatic Programming, Volume 16, Part 2, 1992, Pages 143-152, ISSN 0066-4138, https://doi.org/10.1016/0066-4138(92)90023-I. (https://www.sciencedirect.com/science/article/pii/006641389290023I) Abstract: There is increasing interest in the United Kingdom in training and education in the techniques of Software Project Management. This paper discusses the problems of selecting topics for a course on this subject. It goes on to suggest how the content of such a course may be shaped by taking into account the problems that are encountered by Information Systems and Software Development practitioners in their day-to-day work. Keywords: Software Project Management; Information Systems; Higher Education; Course Design; Research Methods

Ann E. Kelley Sobel, Applying an operational formal method throughout software engineering education, Information and Software Technology, Volume 40, Issue 4, 15 July 1998, Pages 233-238, ISSN 0950-5849, https://doi.org/10.1016/S0950-5849(98)00044-5. (https://www.sciencedirect.com/science/article/pii/S0950584998000445) Abstract: A strategy for integrating formal method application into several courses that are found in a software engineering curriculum is presented.
A balance is struck between teaching undergraduates material which is traditionally taught at the graduate level and a desire to strengthen both the mathematical basis of software engineering education and the complex problem solving skills of our resulting software engineers. Keywords: Formal method; Software engineering education; Undergraduate curriculum

Alex Baker, Emily Oh Navarro, André van der Hoek, An experimental card game for teaching software engineering processes, Journal of Systems and Software, Volume 75, Issues 1–2, 15 February 2005, Pages 3-16, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2004.02.033. (https://www.sciencedirect.com/science/article/pii/S0164121204000378) Abstract: The typical software engineering course consists of lectures in which concepts and theories are conveyed, along with a small “toy” software engineering project which attempts to give students the opportunity to put this knowledge into practice. Although both of these components are essential, neither one provides students with adequate practical knowledge regarding the process of software engineering. Namely, lectures allow only passive learning and projects are so constrained by the time and scope requirements of the academic environment that they cannot be large enough to exhibit many of the phenomena occurring in real-world software engineering processes. To address this problem, we have developed Problems and Programmers, an educational card game that simulates the software engineering process and is designed to teach those process issues that are not sufficiently highlighted by lectures and projects. We describe how the game is designed, the mechanics of its game play, and the results of an experiment we conducted involving students playing the game. Keywords: Software engineering education; Educational games; Software engineering simulation; Simulation games

Éric Germain, Pierre N.
Robillard, Towards software process patterns: An empirical analysis of the behavior of student teams, Information and Software Technology, Volume 50, Issue 11, October 2008, Pages 1088-1097, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2007.10.018. (https://www.sciencedirect.com/science/article/pii/S0950584907001267) Abstract: Traditional software engineering processes are composed of practices defined by roles, activities and artifacts. Software developers have their own understanding of practices and their own ways of implementing them, which could result in variations in software development practices. This paper presents an empirical study based on six teams of five students each, involving three different projects. Their process practices are monitored by time slips based on the effort expended on various process-related activities. This study introduces a new 3-pole graphical representation of the patterns of effort expended on the various discipline activities. The purpose of this study is to quantify activity patterns in the actual process, which in turn demonstrates the variability of process performance. This empirical study provides three examples of patterns based on three empirical axes (engineering, coding and V&V). The idea behind this research is to make developers aware that there is wide variability in the actual process, and that process assessments might be weakly related to actual process activities. This study suggests that in-process monitoring is required to control the process activities. In-process monitoring is likely to provide causal information between the actual process activities and the quality of the implemented components.
Keywords: Software engineering process; Process patterns; Process activities; Process monitoring; Effort; Empirical study; Software engineering education; Project control and modeling; Process measurement

Dietmar Pfahl, Oliver Laitenberger, Günther Ruhe, Jörg Dorsch, Tatyana Krivobokova, Evaluating the learning effectiveness of using simulations in software project management education: results from a twice replicated experiment, Information and Software Technology, Volume 46, Issue 2, 1 February 2004, Pages 127-147, ISSN 0950-5849, https://doi.org/10.1016/S0950-5849(03)00115-0. (https://www.sciencedirect.com/science/article/pii/S0950584903001150) Abstract: The increasing demand for software project managers in industry requires strategies for the development of management-related knowledge and skills of the current and future software workforce. Although several educational approaches help to develop the necessary skills in a university setting, few empirical studies are currently available to characterise and compare their effects. This paper presents the results of a twice replicated experiment that evaluates the learning effectiveness of using a process simulation model for educating computer science students in software project management. While the experimental group applied a System Dynamics simulation model, the control group used the well-known COCOMO model as a predictive tool for project planning. The results of each empirical study indicate that students using the simulation model gain a better understanding about typical behaviour patterns of software development projects. The combination of the results from the initial experiment and the two replications with meta-analysis techniques corroborates this finding. Additional analysis shows that the observed effect can mainly be attributed to the use of the simulation model in combination with a web-based role-play scenario.
This finding is strongly supported by information gathered from the debriefing questionnaires of subjects in the experimental group. They consistently rated the simulation-based role-play scenario as a very useful approach for learning about issues in software project management. Keywords: COCOMO; Learning effectiveness; Replicated experiment; Software project management education; System dynamics simulation Scott Schaefer, Joe Warren, Teaching computer game design and construction, Computer-Aided Design, Volume 36, Issue 14, December 2004, Pages 1501-1510, ISSN 0010-4485, https://doi.org/10.1016/j.cad.2003.10.006. (https://www.sciencedirect.com/science/article/pii/S0010448504000533) Abstract: Computer gaming is a key component of the rapidly growing entertainment industry. While building computer games has typically been a commercial endeavor, we believe that designing and constructing a computer game is also a useful activity for educating students about geometric modeling and computer graphics. In particular, students are exposed to the practical issues surrounding topics such as geometric modeling, rendering, collision detection, character animation and graphical design. Moreover, building an advanced game provides students exposure to the real-world side of software engineering that they are typically shielded from in the standard computer class. In this paper, we describe our experiences with teaching a computer science class that focuses on designing and building the best game possible in the course of a semester. The paper breaks down a typical game into various components that are suited for individual student projects and discusses the use of modern graphical design tools such as Maya in building art for the game. We conclude with a rough timeline for developing a game during the course of a semester and review some of the lessons learned during the three years we have taught the class. 
Keywords: Computer games; Education; Class project Mikael Svahnberg, Frans Mårtensson, Six years of evaluating software architectures in student projects, Journal of Systems and Software, Volume 80, Issue 11, November 2007, Pages 1893-1901, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2007.01.050. (https://www.sciencedirect.com/science/article/pii/S016412120700060X) Abstract: Software architecture evaluations are an important decision support tool when developing software systems. It is thus important that they are conducted professionally and that the results are of high quality. In order to improve the quality, it is necessary for the participants to gain experience in conducting software architecture evaluations. In this article we present guidelines based on six years of experience in software architecture evaluations. Although we primarily focus on our experiences of software architecture evaluation in student projects, we have also applied the same method in industry with similar experiences. Keywords: Software architecture; Software architecture evaluation; Education D. Leinhos, E. Schnieder, Impacts of Modelling Techniques to the Design of Train Localization, IFAC Proceedings Volumes, Volume 27, Issue 12, August 1994, Pages 995-1000, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)47604-X. (https://www.sciencedirect.com/science/article/pii/S147466701747604X) Abstract: To avoid an unstructured growth of a system architecture during the life cycle, structured procedures are required. An increase in project complexity leads to a strong demand for modelling techniques. This paper deals with the methods of modern system engineering, such as requirements engineering, system analysis and system design. The impacts of modelling techniques are illustrated by considering an advanced train localization system for modern railway operations control systems.
The improvements and advantages of the technical integration of satellite navigation with an on-board measurement unit are discussed. Keywords: Functional analysis; modeling; models; pin pointing; railways; specification languages; system analysis; system documentation; train localization Kari Alho, Using the World Wide Web to assist software project course work, Information and Software Technology, Volume 40, Issue 4, 15 July 1998, Pages 245-248, ISSN 0950-5849, https://doi.org/10.1016/S0950-5849(98)00046-9. (https://www.sciencedirect.com/science/article/pii/S0950584998000469) Abstract: Project work is an important element in a software engineering curriculum. Administering student projects, however, can require lots of effort. With the introduction of Web-based course administration, we sought to improve both course management and dissemination of the results. This paper describes the utilization of Internet technologies on our Software Project course, and reports our experiences from over two years' time. Keywords: Software engineering education; World Wide Web; Software project; Computer-assisted education Claes Wohlin, Björn Regnell, Strategies for industrial relevance in software engineering education, Journal of Systems and Software, Volume 49, Issues 2–3, 30 December 1999, Pages 125-134, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(99)00085-0. (https://www.sciencedirect.com/science/article/pii/S0164121299000850) Abstract: This paper presents a collection of experiences related to success factors in graduate and postgraduate education. The experiences are mainly concerned with how to make the education relevant from an industrial viewpoint. This is emphasized as a key issue in software engineering education and research, as the main objective is to give the students a good basis for large-scale software development in an industrial environment. The presentation is divided into experiences at the graduate and postgraduate levels, respectively.
For each level a number of strategies to achieve industrial relevance are presented. On the graduate level, a course in large-scale software development is described to exemplify how industrial relevance can be achieved. The strategies on the postgraduate level have been successful, but it is concluded that more can be done regarding industrial collaboration in the planning and conduct of experiments and case studies. Another interesting strategy for the future is a special postgraduate program for people employed in industry. Keywords: Education; Software engineering; Industrial relevance; Large-scale software development; Technology transfer Klaus Schilling, Hendrik Heimer, Tele-Experiments Using Satellite Telecommunication Links Based on Multimedia Home Platform Standards, IFAC Proceedings Volumes, Volume 37, Issue 7, June 2004, Pages 71-76, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)32126-2. (https://www.sciencedirect.com/science/article/pii/S1474667017321262) Abstract: Satellites for multimedia broadcasting are of particular interest for reaching, in an economical way, fixed and mobile users distributed over large areas. For interactive applications, use of the Multimedia Home Platform standard offers interesting potential. The related hardware and software architectures as well as technology implementation aspects are analysed. Using the example of tele-experiments with mobile robots, typical application scenarios for remote sensor data acquisition and tele-control are addressed. The application potential of these technologies for industrial telemaintenance tasks is outlined. Keywords: telematics; communication satellites; tele-education; telemaintenance PW Garratt, G Edmunds, Teaching software engineering at university, Information and Software Technology, Volume 30, Issue 1, January–February 1988, Pages 5-11, ISSN 0950-5849, https://doi.org/10.1016/0950-5849(88)90100-0.
(https://www.sciencedirect.com/science/article/pii/0950584988901000) Abstract: A software engineering course which is part of the computer science honours degree at Southampton University is described. The course has objectives and problems which are typical of such courses in an academic environment. Ideas are being adopted from software engineering courses in industry to extend and improve the education of students. Industrial training is also borrowing from the academic view of software engineering. There is now a strong link developing from increased technology transfer in this area and benefits are being noted. Keywords: software engineering; education; information systems C. Angelov, R.V.N. Melnik, J. Buur, The synergistic integration of mathematics, software engineering, and user-centred design: exploring new trends in education, Future Generation Computer Systems, Volume 19, Issue 8, November 2003, Pages 1299-1307, ISSN 0167-739X, https://doi.org/10.1016/S0167-739X(03)00088-8. (https://www.sciencedirect.com/science/article/pii/S0167739X03000888) Abstract: There is an increasing recognition in the society that interdisciplinary challenges must be part of new educational practices. In this paper, we describe the key curriculum activities at the University of Southern Denmark that combine mathematical modelling, software engineering, and user-centred design courses. These three disciplines represent a core of our graduate program, aiming at educating the professionals that will be capable of not only using but also further developing new technologies, and therefore, will be capable of fostering further the progress in computational science and engineering. Finally, we show how the learning environment, with emphases on broadening the student experience by industrial links, affects the student career aspiration. Keywords: Education; Computational science and engineering; Pervasive computing and mechatronics A.J. 
Cowling, The role of modelling in the software engineering curriculum, Journal of Systems and Software, Volume 75, Issues 1–2, 15 February 2005, Pages 41-53, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2004.02.021. (https://www.sciencedirect.com/science/article/pii/S0164121204000408) Abstract: This paper argues that the concept of modelling, and particularly the modelling of software system structures, is not being given sufficient attention within current sources that describe aspects of the software engineering curriculum. The paper describes the scope of modelling as a general concept, and explains the role that the modelling of software system structures plays within it. It discusses the treatment of this role within the various sources, and compares this both with the experience of the role that such modelling plays in the undergraduate curriculum at Sheffield University, and with the practice in other branches of engineering. The idea is examined that modelling should be treated as a recurring concept within the curriculum, and it is shown that this gives rise to a matrix structure for the software engineering curriculum. The paper discusses how such a structure can be mapped into a conventional hierarchical curriculum model, and the relationships that need to be made explicit in doing so. It describes the practical implications of these results for the structures of degree programmes in software engineering. Keywords: Software engineering education; Software modelling; Software engineering theory; Software engineering practice; Curriculum structure K. Jopke, R. Knigge, E. Schnieder, Functional Specification of Vital Computer Software for High-Speed Maglev Systems, IFAC Proceedings Volumes, Volume 25, Issue 30, October 1992, Pages 123-128, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)49418-3.
(https://www.sciencedirect.com/science/article/pii/S1474667017494183) Abstract: The system design method of Ward and Mellor is used to describe the functional behaviour of the control and protection system of the high-speed maglev train TRANSRAPID. This operation control system has to control (ATC) and to protect (ATP) many vehicles. A simulation has already been run successfully at Siemens in Braunschweig. A substantial problem is to obtain the proof of the ATP (Automatic Train Protection) of the whole system on the functional level already. In Germany, this proof with regard to safety has to be passed at the TÜV (Technical Supervision Association). The implementation model which contains the functional behaviour is the only object which can be tested against the program code by the experts. The experts who are verifying the system should get a clearly structured specification which allows them to understand the correct relations between the program code and the implementation model. In particular, the real-time aspect with regard to the method of Ward and Mellor and guidelines made to interpret the system description will be considered. Keywords: Control system design; Describing functions; Real time computer systems; Safety proof; Software engineering; System analysis; Train control and protection K. Jopke, R. Knigge, E. Schnieder, Specification of Real-Time Systems for Protection Tasks in Automated High-Speed Transportation Systems, IFAC Proceedings Volumes, Volume 25, Issue 11, June 1992, Pages 67-74, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)50126-3. (https://www.sciencedirect.com/science/article/pii/S1474667017501263) Abstract: The system design method of Ward and Mellor is used to describe the functional behaviour of the control and protection system of the high-speed maglev train TRANSRAPID. This operation control system has to control (ATC) and to protect (ATP) many vehicles.
A simulation has already been run successfully at Siemens in Braunschweig. A substantial problem is to obtain the proof of the ATP (Automatic Train Protection) of the whole system on the functional level already. In Germany, this proof with regard to safety has to be passed at the TÜV (Technical Supervision Association). The implementation model which contains the functional behaviour is the only object which can be tested against the program code by the experts. The experts who are verifying the system should get a clearly structured specification which allows them to understand the correct relations between the program code and the implementation model. In particular, the real-time aspect with regard to the method of Ward and Mellor and guidelines made to interpret the system description will be considered. Keywords: Control system design; Describing functions; Real time computer systems; Safety proof; Software engineering; System analysis; Train control and protection P Brereton, S Lees, M Gumbley, C Boldyreff, S Drummond, P Layzell, L Macaulay, R Young, Distributed group working in software engineering education, Information and Software Technology, Volume 40, Issue 4, 15 July 1998, Pages 221-227, ISSN 0950-5849, https://doi.org/10.1016/S0950-5849(98)00042-1. (https://www.sciencedirect.com/science/article/pii/S0950584998000421) Abstract: Distributed group working of software engineering teams is increasingly evident in the `real world'. Tools to support such working are at present limited to general purpose groupware involving video, audio, chat, shared whiteboards and shared workspaces. Within software engineering education, group tasks have an established role in the curriculum. However, in general, groups are local to a particular university or institution and are composed of students who have a significant shared history (in terms of technical background and social interaction) and who are able to meet face-to-face on a regular basis.
This paper reports on work undertaken by three UK universities to provide students with the opportunity to experience group working across multiple sites using low cost tools to support distributed cooperative working. Keywords: Software engineering education; Distributed group working; CSCW K. Fukuoka, Y. Suzuki, Y. Ueda, Y. Kawai, A Problem Oriented Approach to Reliable Train Tracking Software, IFAC Proceedings Volumes, Volume 14, Issue 2, August 1981, Pages 2401-2406, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)63827-8. (https://www.sciencedirect.com/science/article/pii/S1474667017638278) Abstract: A system of producing train tracking software is presented. The system, called SPRINT, facilitates software maintenance and enhancement by eliminating imperfect requirement specifications and simplifying software structures. SPRINT provides a language to describe requirements for train tracking software. The language consists of only railway terms so that it can be readily understood by anyone. SPRINT draws railway line diagrams from requirements described by the language. The diagrams are the same as original ones so that the requirements are easily examined. SPRINT transforms the requirements into train tracking software. Therefore, the software can be updated by only maintaining the requirements. Keywords: Train traffic control system; process control system; software reliability; software maintenance; software engineering K. Henning, M. Bruns, Design of an Automatic Control System for Train-to-Train Container Transfer, IFAC Proceedings Volumes, Volume 16, Issue 4, April 1983, Pages 165-173, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)62558-8. (https://www.sciencedirect.com/science/article/pii/S1474667017625588) Abstract: Quick transportation of containers by truck and train is one of the expanding markets of goods transportation.
In the case of train transportation, a container is normally entrained on a special waggon, which is part of a larger goods train. The basic idea of a new method of container transfer is the following: Containers shall change from one goods train into another as people change from one passenger train to another. This procedure is done by a special vehicle, which is able to transfer a container from one train to another under the trolley wires. The train-to-train container transfer is supported by three computer systems, two micro-computer systems on the transfer vehicle and a process-computer for the operative control of the transfer process. The design of such a control system leads to a hierarchical control strategy, which describes five levels of control procedures and their interaction between one another as well as the man-machine interactions on some levels of the control problem. Several components of the transfer-system are almost finished, especially software procedures for the process control. Final results will be available in 1983, when the commercial train-to-train container transfer will be introduced by the German railway company “Deutsche Bundesbahn”. Keywords: Hierarchical systems; transportation control; discrete systems; man-machine-systems; computer control; train control; rail-traffic; computer software R. Klar, U. Bayer, Computer-assisted teaching and learning in medicine, International Journal of Bio-Medical Computing, Volume 26, Issues 1–2, July 1990, Pages 7-27, ISSN 0020-7101, https://doi.org/10.1016/0020-7101(90)90016-N. (https://www.sciencedirect.com/science/article/pii/002071019090016N) Abstract: Induced mainly by the increased spreading of personal computers in the last few years, computer-assisted instruction (CAI) systems for medicine have been developed on a large scale.
Proven structure principles are above all the simulation of patient management in a problem-orientated approach, the mathematical simulation of (patho-) physiological functions independent of particular patients and the separation of educational mode and scoring mode. A large choice of programs dealing with topics of internal medicine — especially cardiology — already exists, while operative disciplines are so far less well represented. Programs accredited in the US for continuing medical education (CME) are usually of high quality as to medical contents. Other important quality criteria to be mentioned concerning simulation programs are algorithms of medical decision making, completeness and refinement of the medical knowledge base, software design and user interface. CAI is a unique tool to enhance clinical problem solving skills although — of course — it can by no means replace bedside teaching. Keywords: Computer-assisted instruction; Simulation; Learning software; Medical education; Expert systems; Quality criteria Norman E. Gibbs, Software engineering and computer science: the impending split?, Education and Computing, Volume 7, Issues 1–2, 1991, Pages 111-117, ISSN 0167-9287, https://doi.org/10.1016/S0167-9287(05)80087-1. (https://www.sciencedirect.com/science/article/pii/S0167928705800871) Abstract: A tension is emerging in academia between those who view computing as a science and those who view computing as an emerging engineering discipline. By the year 2000, computing may separate into two academic disciplines, software engineering and computer science, paralleling the split of computer science from mathematics in the late 1960s and early 1970s. Educators must recognize the true needs of the practitioner community, or risk fragmenting and weakening the still new and rapidly evolving field of computing. Actions today will determine if computing will compete with or be subsumed by software engineering, rather than complement it.
Keywords: computer science; industrial software development; software engineering; software engineering education Michael Godfrey, Teaching software engineering to a mixed audience, Information and Software Technology, Volume 40, Issue 4, 15 July 1998, Pages 229-232, ISSN 0950-5849, https://doi.org/10.1016/S0950-5849(98)00043-3. (https://www.sciencedirect.com/science/article/pii/S0950584998000433) Abstract: This paper describes some observations derived from teaching a course in software engineering to a mixed audience of undergraduates and professional Master's degree students at Cornell University. We describe our philosophical goals in teaching the course, some of the problems we encountered, some of the unexpected results, and what we intend to do differently next time. Keywords: Software engineering education; Mixed audiences; Undergraduate; Professional master's degree Donald J Bagert, A model for the software engineering component of a computer science curriculum, Information and Software Technology, Volume 40, Issue 4, 15 July 1998, Pages 195-201, ISSN 0950-5849, https://doi.org/10.1016/S0950-5849(98)00039-1. (https://www.sciencedirect.com/science/article/pii/S0950584998000391) Abstract: Software engineering is considered to be one of the major subareas of computer science. Therefore, regardless of what academic unit computer science is housed in, software engineering should play an essential role. Also, an increased interest in accrediting software engineering degree programs, and in licensing software engineers as professional engineers, has led to increased interest in software engineering by computer science programs. This paper examines how software engineering can be effectively integrated into a computer science curriculum. A model based on a four-course sequence is presented and discussed. 
This model is then compared to the software engineering component of the computer science undergraduate degree program at Texas Tech University, which implements most of the recommendations made. Finally, a discussion of how this model can be used in other types of computer science programs is included. Keywords: Software engineering education; Computer science education W.A. Halang, M. Witte, An Educational Platform for Variable-Structure Control of Safety-Related Systems, IFAC Proceedings Volumes, Volume 27, Issue 3, June 1994, Pages 367-372, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)46137-4. (https://www.sciencedirect.com/science/article/pii/S1474667017461374) Abstract: A model of robust variable-structure controllers and their use in distributed systems for optimisation of sub-processes is presented. The observer concept is introduced as a basis for variable-structure control. A platform to transfer the formal model into applications is described. With emphasis on student education, this platform provides an easy-to-use environment for control software design, correctness proof, and implementation as well as for its validation in a realistic testbed (conveyor belt model). Keywords: Variable-structure control; safety-related systems; smart microcontroller; special-purpose architecture; education; software-implementation C. Richardson, L. Copeland, B. Wheeler, Obstacles to Shop-Floor CNC Programming in the United States, IFAC Proceedings Volumes, Volume 23, Issue 8, Part 6, August 1990, Pages 487-491, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)51469-X. (https://www.sciencedirect.com/science/article/pii/S147466701751469X) Abstract: The implementation of skill-based approaches to technological development faces several obstacles in the United States.
This paper explores three of these obstacles in the context of attempts to implement shop-floor programming of CNC machine tools: 1) Software design, 2) Orientation of training programs, and 3) Social issues of control. These obstacles are discussed in relation to the authors' experiences in conducting the Shop-Floor Programming Project at the University of Lowell. Skill-based approaches which allow the voice of skilled production workers to influence the development of new technologies can lead to improved overall working life. Solutions to overcome the obstacles associated with skill-based automation require further investigation into the social issues underlying both the design and implementation of process technologies. Keywords: Computer Aided Manufacturing; Computer Numerical Control; Education; Numerical Control; Programming; Skill Based Automation; Software Development; Human Centered; Social Effects David Garlan, Making formal methods education effective for professional software engineers, Information and Software Technology, Volume 37, Issues 5–6, May–June 1995, Pages 261-268, ISSN 0950-5849, https://doi.org/10.1016/0950-5849(95)99361-P. (https://www.sciencedirect.com/science/article/pii/095058499599361P) Abstract: A critical issue in the design of a professional software engineering degree program is the way in which formal methods are integrated into the curriculum. The approach taken by most programs is to teach formal techniques for software development in a separate course on formal methods. In this paper we detail some of the problems with that approach and describe an alternative in which formal methods are integrated across the curriculum. We illustrate the strengths and weaknesses of this alternative in terms of our experience of using it in the Master of Software Engineering Program at Carnegie Mellon University.
Keywords: formal methods; software engineering education; masters programs; Z; MSE David Gries, Improving the curriculum through the teaching of calculation and discrimination, Education and Computing, Volume 7, Issues 1–2, 1991, Pages 61-72, ISSN 0167-9287, https://doi.org/10.1016/S0167-9287(05)80082-2. (https://www.sciencedirect.com/science/article/pii/S0167928705800822) Abstract: The field of computing—including its application by programmers, systems analysts and others—suffers tremendously from a lack of use of formal reasoning in everyday work. The problem can be solved, at least partly, by teaching calculational methods in mathematics and programming in the freshman year, and in such a way that students develop a skill with the methods, and not just a passing understanding. Freshman courses should instil the idea that mathematical techniques can help and should encourage a sense of discrimination, of making judgements on technical matters based on technical concerns. A serious overhaul of the conventional second programming course and the discrete mathematics course along these lines could have a profound impact on the whole computer science curriculum. Moreover, a properly designed course would be beneficial for mathematicians and engineers as well. Keywords: teaching logic; computer science education; programming methodology; software engineering Jochen Ludewig, Ralf Reißing, Teaching what they need instead of teaching what we like—the new software engineering curriculum at the University of Stuttgart, Information and Software Technology, Volume 40, Issue 4, 15 July 1998, Pages 239-244, ISSN 0950-5849, https://doi.org/10.1016/S0950-5849(98)00045-7. (https://www.sciencedirect.com/science/article/pii/S0950584998000457) Abstract: At the University of Stuttgart, a new curriculum called Software Engineering was launched in October 1996. It is offered by the Department of Informatics, and supplements the standard informatics curriculum started in 1970. 
The curriculum is new not only for the University of Stuttgart, but for all German-speaking universities. This article first provides some background information about German universities in general. After this, the reasons for starting a new curriculum, its goals and its limitations are discussed. Finally, the current state and future steps are outlined. Keywords: Software engineering education; Software engineering curriculum; Practical approach J. Comer, T. Nute, D. Rodjak, CASAS: engineering software applications, Information and Software Technology, Volume 33, Issue 6, July–August 1991, Pages 443-450, ISSN 0950-5849, https://doi.org/10.1016/0950-5849(91)90080-U. (https://www.sciencedirect.com/science/article/pii/095058499190080U) Abstract: CASAS is a generic computer-aided student advising system that was developed as a two-semester software engineering project for undergraduate computer-science majors and graduate students enrolled in the Master of Software Design and Development (MSDD) program at Texas Christian University. The purpose of CASAS is to assist college and university faculty with their student advising responsibilities. The project management experiences gained in conducting the course during the 1988–1989 and 1989–1990 academic years are described. Keywords: software engineering; project management; software engineering education D. Glüer, G. Schmidt, Petri Net Models for Control of Manufacturing Systems - A Laboratory Experiment, IFAC Proceedings Volumes, Volume 27, Issue 9, August 1994, Pages 297-300, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)45954-4. (https://www.sciencedirect.com/science/article/pii/S1474667017459544) Abstract: For flexible manufacturing systems (FMS), Petri nets (PN) prove to be a convenient tool for modeling, design and analysis of automation tasks. To familiarise students with PN, with software engineering, and with the systematic top-down approach of a control design, a small scale FMS is used.
Starting with the production requirements, the performance of the software life cycle thus leads to a hierarchical control algorithm for the discrete process. The developed PN-based controller can be used both for push and pull production strategies and can be easily extended to include quality control tasks, fault recovery or multiple product manufacturing. Keywords: Control system design; Control system synthesis; Discrete systems; Education; Flexible manufacturing; Modeling; Petri nets; Production strategies; Software engineering Frank L. Friedman, The teaching and practice of software design concepts early in a CIS curriculum, Education and Computing, Volume 2, Issue 4, 1986, Pages 291-303, ISSN 0167-9287, https://doi.org/10.1016/S0167-9287(86)91465-2. (https://www.sciencedirect.com/science/article/pii/S0167928786914652) Abstract: The wealth of literature on software design issues has not yet been synthesized or organized for use beginning at the elementary levels of computing science curricula. Despite exposure to the literature and advanced level courses in software engineering topics, computer science education has failed to provide adequate tools and environments for repetitive practice of well known design and implementation techniques. Some reasons for this problem are outlined in this paper, and the necessity for urgent change is argued. A possible remedy for this situation is also described. A methodology for software design and implementation is outlined, and some simple tools that can provide a framework in which to practice this methodology are illustrated. The suggested technique is based upon the Jackson structured design process. It is shown that this technique can be applied equally well to simple and to moderately complex systems: to business systems as well as compilers, assemblers, and simulation systems.
It is further suggested that the systems implemented using this technique naturally exhibit a number of properties now generally associated with good quality software. The described process is also shown to mirror abstraction-based approaches even down to the implementation level using a simple encapsulation unit in a Pascal-like pseudo code. Keywords: Computer Science Education; Software Engineering Tools and Techniques; Programming Methodologies; Structured Design; Abstraction-Based Design G. Pomberger, Software engineering education—adjusting our sails, Education and Computing, Volume 8, Issue 4, April 1993, Pages 287-294, ISSN 0167-9287, https://doi.org/10.1016/0167-9287(93)90381-A. (https://www.sciencedirect.com/science/article/pii/016792879390381A) Abstract: It is clear to everyone involved in the field of software engineering that our instruction lags well behind research. Conservative thinkers do not want to leap onto a given bandwagon too early, while even the most progressive thinkers face a profound challenge in incorporating worldwide research into the curriculum. Based on a discussion of current trends in software engineering, this presentation sketches the emphases that need to be placed in the software engineering curriculum. Keywords: software engineering curriculum; software engineering trends; software engineering education; computer science curriculum; computer science trends; computer science education Jean-Marie Flaus, Arlette Cheruy, François Lebourgeois, Odasys: A Software Tool to Help Knowledge and Experience Interaction Between University and Process Industry, IFAC Proceedings Volumes, Volume 27, Issue 9, August 1994, Pages 221-223, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)45935-0. (https://www.sciencedirect.com/science/article/pii/S1474667017459350) Abstract: This paper presents an approach for improving the interaction between university and process industry in the field of process control.
The main idea is to provide research results to the industry world, not as a paper but as a software tool that can be readily used on an industrial unit. As a matter of fact, the effort that is needed from the research world can be reduced with a suited software architecture, such as the one that we describe here. The functionalities of the current version of the ODASYS tool cover all the basic needs of control engineers, that is to say, identification, simulation and PID tuning, and introduce some more recent approaches such as Internal Model Control. Keywords: Process control; Education; Industrial applications; Computer Aided Control and System Design Ben Livson, EduSET: educational software engineering tool, Information and Software Technology, Volume 30, Issue 4, May 1988, Pages 237-242, ISSN 0950-5849, https://doi.org/10.1016/0950-5849(88)90084-5. (https://www.sciencedirect.com/science/article/pii/0950584988900845) Abstract: Computer science graduates do not necessarily make good software engineers. The educational software engineering tool (EduSET) project offers a partial solution to this and other problems. The EduSET interface to software tool information and demonstration is specified and an outline of the architecture is given. Keywords: software engineering; education; development management system; software quality assurance; configuration management A.P. Ravn, Use of concurrent Pascal in systems programming teaching, Microprocessing and Microprogramming, Volume 10, Issue 1, August 1982, Pages 33-35, ISSN 0165-6074, https://doi.org/10.1016/0165-6074(82)90120-X. (https://www.sciencedirect.com/science/article/pii/016560748290120X) Abstract: Concurrent Pascal is a high level systems programming language designed by P. Brinch Hansen. A model implementation exists and has been used at the author's institute since early 1977.
A project oriented systems programming course has been based on the implementation and the book “The Architecture of Concurrent Programs”, written by Brinch Hansen. The aims and contents of the courses, and their impact on participants' abilities as system designers and programmers, are described. Finally, alternatives to Concurrent Pascal are discussed. Keywords: Education; Software Engineering; Pascal Ari Heiskanen, Jaana Helanterä, Towards better software management through careful analysis of current application systems, Information & Management, Volume 5, Issue 3, 1982, Pages 175-184, ISSN 0378-7206, https://doi.org/10.1016/0378-7206(82)90024-6. (https://www.sciencedirect.com/science/article/pii/0378720682900246) Abstract: Results are reported of empirical investigations into EDP applications in use. A model for connecting the terms of the users of the applications to the consumed hardware resources is proposed. The properties of so-called “application profiling” are studied. Discussions are based on thorough analyses of a few EDP applications, and one of them is abridged as a case study. Keywords: software engineering; software measurements; application profiling; efficiency of EDP applications; EDP education Ufuk Aydan, Murat Yilmaz, Paul M. Clarke, Rory V. O’Connor, Teaching ISO/IEC 12207 software lifecycle processes: A serious game approach, Computer Standards & Interfaces, Volume 54, Part 3, November 2017, Pages 129-138, ISSN 0920-5489, https://doi.org/10.1016/j.csi.2016.11.014. (https://www.sciencedirect.com/science/article/pii/S0920548916301969) Abstract: Serious games involve applying game design techniques to tasks of a serious nature. In particular, serious games can be used as informative tools and can be embedded in formal education.
Although there are some studies related to the application of serious games for the software development process, there is no serious game that teaches the fundamentals of ISO/IEC 12207:1995 Systems and software engineering – Software life cycle processes, an international standard for software lifecycle processes that aims to be ‘the’ standard that defines all the tasks required for developing and maintaining software. “Floors” is a serious game that proposes an interactive learning experience to introduce ISO/IEC 12207:1995 by creating different floors of a virtual environment where various processes of the standard are discussed and implemented. Inherently, it follows an iterative process based on interactive technical dialogues in a 3D computer-simulated office. The tool is designed to assess novice engineering practitioners' knowledge and provide preliminary training for ISO/IEC 12207:1995 processes. By playing such a game, participants are able to learn about the details of this standard. The present study provides a framework for the exploration of research data obtained from computer engineering students. Results suggest that there is a significant difference in the knowledge gained between the students who played Floors and those who only participated in paper-based learning sessions. Our findings indicate that participants who played Floors tend to have greater knowledge of the ISO/IEC 12207:1995 standard, and as a result, we recommend the use of serious games, which appear superior to the traditional paper-based approach. Keywords: ISO/IEC 12207; Serious games; Teaching standards Jerónimo Hernández-González, Daniel Rodriguez, Iñaki Inza, Rachel Harrison, Jose A. Lozano, Learning to classify software defects from crowds: A novel approach, Applied Soft Computing, Volume 62, January 2018, Pages 579-591, ISSN 1568-4946, https://doi.org/10.1016/j.asoc.2017.10.047.
(https://www.sciencedirect.com/science/article/pii/S1568494617306610) Abstract: In software engineering, associating each reported defect with a category allows, among many other things, for the appropriate allocation of resources. Although this classification task can be automated using standard machine learning techniques, the categorization of defects for model training requires expert knowledge, which is not always available. To circumvent this dependency, we propose to apply the learning from crowds paradigm, where training categories are obtained from multiple non-expert annotators (and so may be incomplete, noisy or erroneous) and, dealing with this subjective class information, classifiers are efficiently learnt. To illustrate our proposal, we present two real applications of IBM's orthogonal defect classification, working on the issue tracking systems of two different real domains. Bayesian network classifiers learnt using two state-of-the-art methodologies from data labeled by a crowd of annotators are used to predict the category (impact) of reported software defects. The considered methodologies show enhanced performance relative to the straightforward solution (majority voting) according to different metrics. This shows the possibilities of using non-expert knowledge aggregation techniques when expert knowledge is unavailable. Keywords: Learning from crowds; Orthogonal defect classification; Missing ground truth; Bayesian network classifiers Chuanyi Li, Liguo Huang, Jidong Ge, Bin Luo, Vincent Ng, Automatically classifying user requests in crowdsourcing requirements engineering, Journal of Systems and Software, Volume 138, April 2018, Pages 108-123, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2017.12.028. (https://www.sciencedirect.com/science/article/pii/S0164121217303096) Abstract: In order to make a software project succeed, it is necessary to determine the requirements for systems and to document them in a suitable manner.
Many approaches to requirements elicitation have been discussed. One is to gather requirements with crowdsourcing methods, which has been discussed for years and is called crowdsourcing requirements engineering. User request forums in open source communities, where users can propose their expected features of a software product, are common examples of platforms for gathering requirements from the crowd. Requirements collected from these platforms are often informal text descriptions, which we call user requests. In order to transform user requests into structured software requirements, it is better to know the class of requirements that each request belongs to, so that each request can be rewritten according to the corresponding requirement templates. In this paper, we propose an effective classification methodology employing both project-specific and non-project-specific keywords together with machine learning algorithms. The proposed strategy achieves high classification accuracy by using keywords as features, reduces considerable manual effort in building machine-learning-based classifiers, and performs stably in finding minority classes no matter how few instances they have. Keywords: Crowdsourcing requirements engineering; Software requirements classification; Machine learning; Natural language processing Evgeny V. Podryabinkin, Alexander V. Shapeev, Active learning of linearly parametrized interatomic potentials, Computational Materials Science, Volume 140, December 2017, Pages 171-180, ISSN 0927-0256, https://doi.org/10.1016/j.commatsci.2017.08.031. (https://www.sciencedirect.com/science/article/pii/S0927025617304536) Abstract: This paper introduces an active learning approach to the fitting of machine learning interatomic potentials. Our approach is based on the D-optimality criterion for selecting atomic configurations on which the potential is fitted.
It is shown that the proposed active learning approach is highly efficient in training potentials on the fly, ensuring that no extrapolation is attempted and leading to a completely reliable atomistic simulation without any significant decrease in accuracy. We apply our approach to molecular dynamics and structure relaxation, and we argue that it can be applied, in principle, to any other type of atomistic simulation. The software, test cases, and examples of usage are published at http://gitlab.skoltech.ru/shapeev/mlip/. Keywords: Interatomic potential; Active learning; Learning on the fly; Machine learning; Atomistic simulation; Moment tensor potentials Makrina Viola Kosti, Kostas Georgiadis, Dimitrios A. Adamos, Nikos Laskaris, Diomidis Spinellis, Lefteris Angelis, Towards an affordable brain computer interface for the assessment of programmers’ mental workload, International Journal of Human-Computer Studies, Volume 115, July 2018, Pages 52-66, ISSN 1071-5819, https://doi.org/10.1016/j.ijhcs.2018.03.002. (https://www.sciencedirect.com/science/article/pii/S1071581918300934) Abstract: This paper provides a proof of concept for the use of wearable technology, and specifically wearable Electroencephalography (EEG), in the field of Empirical Software Engineering. In particular, we investigated the brain activity of Software Engineers (SEngs) while performing two distinct but related mental tasks: understanding code and inspecting code for syntax errors. By comparing the emerging EEG patterns of activity and neural synchrony, we identified brain signatures that are specific to code comprehension. Moreover, using the programmers' ratings of the difficulty of each code snippet shown, we identified neural correlates of subjective difficulty during code comprehension. Finally, we attempted to build a model of subjective difficulty based on the recorded brainwave patterns.
The reported results show promise towards novel alternatives to programmers’ training and education. Findings of this kind may eventually lead to technical and methodological improvements in various aspects of software development, such as programming languages, building platforms for teams, and team working schemes. Keywords: Brainwaves; Wearable EEG; Neural synchrony; Human factor; Software engineering; Neuroergonomics André Pinheiro, Paulo Fernandes, Ana Maia, Gonçalo Cruz, Daniela Pedrosa, Benjamim Fonseca, Hugo Paredes, Paulo Martins, Leonel Morgado, Jorge Rafael, Development of a mechanical maintenance training simulator in OpenSimulator for F-16 aircraft engines, Entertainment Computing, Volume 5, Issue 4, December 2014, Pages 347-355, ISSN 1875-9521, https://doi.org/10.1016/j.entcom.2014.06.002. (https://www.sciencedirect.com/science/article/pii/S1875952114000184) Abstract: Mechanical maintenance of F-16 engines is carried out as a team effort involving 3–4 skilled engine technicians, but the details of its procedures and requisites change constantly, to improve safety, optimize resources, and respond to knowledge learned from field outcomes. This provides a challenge for the development of training simulators, since simulated actions risk becoming obsolete rapidly and require costly reimplementation. This paper presents the development of a 3D mechanical maintenance training simulator for this context, using a low-cost simulation platform and a software architecture that separates simulation control from simulation visualization, in view of enabling more agile adaptation of simulators. This specific simulator aims to enable technician training to be enhanced with cooperation and context prior to the training phase with actual physical engines.
We provide data in support of the feasibility of this approach, describing the requirements that were identified with the Portuguese Air Force, the overall software architecture of the system, the current stage of the prototype, and the outcomes of the first field tests with users. Keywords: Virtual worlds; OpenSimulator; Virtual learning; Cooperation; Task coordination; Aircraft engine maintenance André Pinheiro, Paulo Fernandes, Ana Maia, Gonçalo Cruz, Daniela Pedrosa, Benjamim Fonseca, Hugo Paredes, Paulo Martins, Leonel Morgado, Jorge Rafael, Development of a Mechanical Maintenance Training Simulator in OpenSimulator for F-16 Aircraft Engines, Procedia Computer Science, Volume 15, 2012, Pages 248-255, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2012.10.076. (https://www.sciencedirect.com/science/article/pii/S1877050912008381) Abstract: Mechanical maintenance of F-16 engines is carried out as a team effort involving 3 to 4 skilled engine technicians. This paper presents the development of a mechanical maintenance simulator for their training. This simulator aims to enable technician training to be enhanced with cooperation and context prior to the training phase with actual physical engines. We describe the requirements that were identified with the Portuguese Air Force, the overall software architecture of the system, the current stage of the prototype, and the outcomes of the first field tests with users. Keywords: Virtual worlds; OpenSimulator; Virtual learning; Cooperation; Task coordination; Aircraft engine maintenance Ying Ma, Guangchun Luo, Xue Zeng, Aiguo Chen, Transfer learning for cross-company software defect prediction, Information and Software Technology, Volume 54, Issue 3, March 2012, Pages 248-256, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2011.09.007. 
(https://www.sciencedirect.com/science/article/pii/S0950584911001996) Abstract: Context Software defect prediction studies have usually built models using within-company data, but very few have focused on prediction models trained with cross-company data. It is difficult to employ models built on within-company data in practice, because such local data repositories are often lacking. Recently, transfer learning has attracted more and more attention for building classifiers in a target domain using data from a related source domain. It is very useful in cases where the distributions of training and test instances differ, but is it appropriate for cross-company software defect prediction? Objective In this paper, we consider the cross-company defect prediction scenario where source and target data are drawn from different companies. In order to harness cross-company data, we exploit transfer learning to build a fast and highly effective prediction model. Method Unlike prior works that select training data similar to the test data, we propose a novel algorithm called Transfer Naive Bayes (TNB), which uses the information of all the proper features in the training data. Our solution estimates the distribution of the test data and transfers cross-company data information into the weights of the training data. On these weighted data, the defect prediction model is built. Results This article presents a theoretical analysis of the comparative methods and shows experimental results on data sets from different organizations. It indicates that TNB is more accurate in terms of AUC (the area under the receiver operating characteristic curve), with less runtime than the state-of-the-art methods. Conclusion It is concluded that when there are too few local training data to train good classifiers, useful knowledge from different-distribution training data at the feature level may help.
We are optimistic that our transfer learning method can guide optimal resource allocation strategies, which may reduce software testing cost and increase the effectiveness of the software testing process. Keywords: Machine learning; Software defect prediction; Transfer learning; Naive Bayes; Different distribution Jinyong Wang, Ce Zhang, Software reliability prediction using a deep learning model based on the RNN encoder–decoder, Reliability Engineering & System Safety, Volume 170, February 2018, Pages 73-82, ISSN 0951-8320, https://doi.org/10.1016/j.ress.2017.10.019. (https://www.sciencedirect.com/science/article/pii/S0951832017303538) Abstract: Different software reliability models, both parametric and non-parametric, have been developed over the past four decades to assess software reliability in the software testing process. Although these models can effectively assess software reliability in certain testing scenarios, no single model can accurately predict the number of faults in software under all testing conditions. In particular, modern software is developed with ever greater size and functionality, making software reliability assessment a remarkably difficult task. The recently developed deep learning model, the deep neural network (NN), not only deepens the layer levels but can also adapt to capture the training characteristics; a comprehensive, in-depth study and feature excavation ultimately show that it can achieve suitable prediction performance. This study utilizes a deep learning model based on the recurrent NN (RNN) encoder–decoder to predict the number of faults in software and assess software reliability. Experimental results show that the proposed model has better prediction performance than other parametric and NN models.
Keywords: Deep learning model based on RNN encoder–decoder; Model comparison; Neural network models; Parameter models; Software reliability Maya Daneva, Striving for balance: A look at gameplay requirements of massively multiplayer online role-playing games, Journal of Systems and Software, Volume 134, December 2017, Pages 54-75, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2017.08.009. (https://www.sciencedirect.com/science/article/pii/S0164121217301644) Abstract: Engineering gameplay requirements is the most important task for game development organizations. Game industry discourse is concerned with the continuous redesign of gameplay to enhance players' experience and boost a game's appeal. However, accounts of gameplay requirements practices are rare. In responding to calls for more research into gameplay requirements engineering, we performed an exploratory study in the context of massively multiplayer online role-playing games (MMORPGs), from the perspective of practitioners involved in the field. Sixteen practitioners from three leading MMORPG-producing companies were interviewed and their gameplay requirements documents were reviewed. Interviewing and qualitative data analysis occurred in a cyclical process, with results at each stage of the study informing decisions about data collection and analysis in the next. The analysis revealed a process of striving to reach a balance among three perspectives on gameplay requirements: a process perspective, an artifact perspective and a player-designer relationship perspective. This balance-driven process is co-created by game developers and players, is endless within the MMORPG, and happens both in-game and off-game. It relies heavily on 'paper-prototyping' and play-testing for the purpose of gameplay requirements validation. The study concludes with a discussion of validity threats and of implications for requirements engineering research, practice and education.
Keywords: Requirements engineering; Gameplay requirements; Requirements elicitation; Qualitative interview-based study; Empirical research method Dylan G.M. Schouten, Rosie T. Paulissen, Marieke Hanekamp, Annemarie Groot, Mark A. Neerincx, Anita H.M. Cremers, Low-literates’ support needs for societal participation learning: Empirical grounding of theory- and model-based design, Cognitive Systems Research, Volume 45, October 2017, Pages 30-47, ISSN 1389-0417, https://doi.org/10.1016/j.cogsys.2017.04.007. (https://www.sciencedirect.com/science/article/pii/S1389041716301401) Abstract: Specialized learning support software can address the low societal participation of low-literate Dutch citizens. We use the situated Cognitive Engineering method to iteratively create a design specification for the envisioned system VESSEL: a Virtual Environment to Support the Societal participation Education of Low-literates. An initial high-level specification for this system is refined by incorporating the societal participation experiences of low-literate citizens into the design. In two series of user studies, the participant workshop and cultural probe methods were used with 23 low-literate participants. The Grounded Theory method was used to process the rich user data from these studies into the Societal Participation Experience of Low-Literates (SPELL) model. Using this experience model, the existing VESSEL specification was refined: requirements were empirically situated in the daily practice of low-literate societal participation, and new claims were written to explicate the learning effectiveness of the proposed VESSEL system. In conclusion, this study provides a comprehensive, theoretically and empirically grounded set of requirements and claims for the proposed VESSEL system, as well as the underlying SPELL model, which captures the societal participation experiences of low-literate citizens.
The research methods used in this study are shown to be effective for requirements engineering with low-literate users. Keywords: Societal participation; Low-literacy; Virtual learning environment; Situated Cognitive Engineering; Requirements engineering; Qualitative methods Iván García-Magariño, Guillermo Palacios-Navarro, ATABS: A technique for automatically training agent-based simulators, Simulation Modelling Practice and Theory, Volume 66, August 2016, Pages 174-192, ISSN 1569-190X, https://doi.org/10.1016/j.simpat.2016.04.003. (https://www.sciencedirect.com/science/article/pii/S1569190X15301647) Abstract: The automatic training of agent-based simulators can be a complex task because of (a) their commonly nondeterministic behavior and (b) the complex relationships between their input parameters and outputs. This work presents a technique called ATABS for automatically training agent-based simulators. This technique is based on a novel mechanism for generating random numbers that reduces the variability of the global results. This work provides a framework that automates this training by considering the relationships between the simulation parameters and the output features. This technique and framework have been applied to automatically train two different simulators. The current approach has been empirically compared with the most similar alternative. The results show that ATABS outperforms this alternative considering (1) the similarity between simulated and real data and (2) the execution time of the training process. The ATABS framework is publicly available. In this way, it not only ensures the reproducibility of the experiments but also allows practitioners to apply the current approach to different agent-based simulators. Keywords: Agent-based simulator; Agent-oriented software engineering; Calibration; Random number generator; Multi-agent system; Variance reduction technique Terence Tang, Morgan E.
Lim, Elizabeth Mansfield, Alexander McLachlan, Sherman D. Quan, Clinician user involvement in the real world: Designing an electronic tool to improve interprofessional communication and collaboration in a hospital setting, International Journal of Medical Informatics, Volume 110, February 2018, Pages 90-97, ISSN 1386-5056, https://doi.org/10.1016/j.ijmedinf.2017.11.011. (https://www.sciencedirect.com/science/article/pii/S1386505617304252) Abstract: Objectives User involvement is vital to the success of health information technology implementation. However, involving clinician users effectively and meaningfully in complex healthcare organizations remains challenging. The objective of this paper is to share our real-world experience of applying a variety of user involvement methods in the design and implementation of a clinical communication and collaboration platform aimed at facilitating the care of complex hospitalized patients by an interprofessional team of clinicians. Methods We designed and implemented an electronic clinical communication and collaboration platform in a large community teaching hospital. The design team consisted of both technical and healthcare professionals. Agile software development methodology was used to facilitate rapid iterative design and user input. We involved clinician users at all stages of the development lifecycle using a variety of user-centered, user co-design, and participatory design methods. Results Thirty-six software releases were delivered over 24 months. User involvement resulted in improvements to the user interface design, identification of software defects, creation of new modules that facilitated workflow, and identification of necessary changes to the scope of the project early on. Conclusion A variety of user involvement methods were complementary and benefited the design and implementation of a complex health IT solution.
Combining these methods with agile software development methodology can turn designs into a functioning clinical system that supports iterative improvement. Keywords: Software design; User involvement; Hospital communication system Kai Zheng, V.G. Vinod Vydiswaran, Yang Liu, Yue Wang, Amber Stubbs, Özlem Uzuner, Anupama E. Gururaj, Samuel Bayer, John Aberdeen, Anna Rumshisky, Serguei Pakhomov, Hongfang Liu, Hua Xu, Ease of adoption of clinical natural language processing software: An evaluation of five systems, Journal of Biomedical Informatics, Volume 58, Supplement, December 2015, Pages S189-S196, ISSN 1532-0464, https://doi.org/10.1016/j.jbi.2015.07.008. (https://www.sciencedirect.com/science/article/pii/S1532046415001483) Abstract: Objective In recognition of potential barriers that may inhibit the widespread adoption of biomedical software, the 2014 i2b2 Challenge introduced a special track, Track 3 – Software Usability Assessment, in order to develop a better understanding of the adoption issues that might be associated with state-of-the-art clinical NLP systems. This paper reports the ease-of-adoption assessment methods we developed for this track, and the results of evaluating five clinical NLP system submissions. Materials and methods A team of human evaluators performed a series of scripted adoptability test tasks with each of the participating systems. The evaluation team consisted of four “expert evaluators” with training in computer science, and eight “end user evaluators” with mixed backgrounds in medicine, nursing, pharmacy, and health informatics. We assessed how easy it is to adopt the submitted systems along the following three dimensions: communication effectiveness (i.e., how effective a system is in communicating its designed objectives to the intended audience), effort required to install, and effort required to use.
We used a formal software usability testing tool, TURF, to record the evaluators’ interactions with the systems and ‘think-aloud’ data revealing their thought processes when installing and using the systems and when resolving unexpected issues. Results Overall, the ease-of-adoption ratings that the five systems received are unsatisfactory. Installation of some of the systems proved to be rather difficult, and some systems failed to adequately communicate their designed objectives to intended adopters. Further, the average ratings provided by the end user evaluators on ease of use and ease of interpreting output are −0.35 and −0.53, respectively, indicating that this group of users generally deemed the systems extremely difficult to work with. While the ratings provided by the expert evaluators are higher, 0.6 and 0.45, respectively, these ratings are still low, indicating that they also experienced considerable struggles. Discussion The results of the Track 3 evaluation show that the adoptability of the five participating clinical NLP systems has a great margin for improvement. Remedy strategies suggested by the evaluators included (1) more detailed and operating-system-specific use instructions; (2) provision of more pertinent onscreen feedback for easier diagnosis of problems; (3) including screen walk-throughs in use instructions so users know what to expect and what might have gone wrong; (4) avoiding jargon and acronyms in materials intended for end users; and (5) packaging prerequisites within software distributions so that prospective adopters of the software do not have to obtain each of the third-party components on their own.
Keywords: Usability; Human–computer interaction; User-computer interface [L01.224.900.910]; Software design [L01.224.900.820]; Software validation [L01.224.900.868]; Natural language processing [L01.224.065.580] Fuqun HUANG, Bin LIU, Software defect prevention based on human error theories, Chinese Journal of Aeronautics, Volume 30, Issue 3, June 2017, Pages 1054-1070, ISSN 1000-9361, https://doi.org/10.1016/j.cja.2017.03.005. (https://www.sciencedirect.com/science/article/pii/S1000936117300778) Abstract: Software defect prevention is an important way to reduce the defect introduction rate. As the primary cause of software defects, human error can be the key to understanding and preventing software defects. This paper proposes a defect prevention approach based on human error mechanisms: DPeHE. The approach includes both knowledge and regulation training in human error prevention. Knowledge training provides programmers with explicit knowledge on why programmers commit errors, what kinds of errors tend to be committed under different circumstances, and how these errors can be prevented. Regulation training further helps programmers to promote the awareness and ability to prevent human errors through practice. The practice is facilitated by a problem solving checklist and a root cause identification checklist. This paper provides a systematic framework that integrates knowledge across disciplines, e.g., cognitive science, software psychology and software engineering, to defend against human errors in software development. Furthermore, we applied this approach in an international company at CMM Level 5 and a software development institution at CMM Level 1 in the Chinese Aviation Industry. The application cases show that the approach is feasible and effective in promoting developers’ ability to prevent software defects, independent of process maturity levels.
Keywords: Human factor; Human error; Programming; Root cause analysis; Software defect prevention; Software design; Software quality; Software psychology Philipp Last, Martin Kroker, Lars Linsen, Generating real-time objects for a bridge ship-handling simulator based on automatic identification system data, Simulation Modelling Practice and Theory, Volume 72, March 2017, Pages 69-87, ISSN 1569-190X, https://doi.org/10.1016/j.simpat.2016.12.011. (https://www.sciencedirect.com/science/article/pii/S1569190X16302830) Abstract: Most accidents at sea are caused by decision errors made by crew members. Hence, nautical education plays an important role. A common approach to training crew members is the use of ship-handling simulators. Our paper aims to bring simulator-based education for nautical personnel closer to real-world scenarios by integrating real objects into the simulation process. This integration aims at improving the learning experience, leading to higher safety at sea. Since the introduction of the Automatic Identification System (AIS), which has to be installed on professionally operated vessels, vessel movements can be tracked. Thus, we use AIS data for the data integration process. Within this context, several practical problems are addressed which arise in the design of a software architecture that uses live AIS data. These include the availability of specific AIS data attributes, the AIS reporting intervals, and the mapping of AIS data to the Distributed Interactive Simulation (DIS) interface. The presented results have been successfully implemented in a software architecture which integrates a live AIS data stream into the simulation process of a full mission bridge simulator.
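A first practical step in the live-AIS integration described above is validating the NMEA 0183 sentences that AIS transponders emit. The following minimal sketch (an illustration, not code from the paper) checks the standard NMEA checksum: the XOR of every character between the leading '!' (or '$') and the '*', written as two hex digits.

```python
# Minimal NMEA 0183 checksum validation for AIS (AIVDM) sentences.
# The example sentence is a well-known public AIVDM sample.

def nmea_checksum(body: str) -> str:
    """XOR of all characters in the sentence body, as two uppercase hex digits."""
    acc = 0
    for ch in body:
        acc ^= ord(ch)
    return f"{acc:02X}"

def is_valid_sentence(sentence: str) -> bool:
    """sentence: e.g. '!AIVDM,...*hh'. True if the declared checksum matches."""
    if not sentence or sentence[0] not in "!$" or "*" not in sentence:
        return False
    body, _, declared = sentence[1:].partition("*")
    return nmea_checksum(body) == declared.strip().upper()

print(is_valid_sentence("!AIVDM,1,1,,B,177KQJ5000G?tO`K>RA1wUbN0TKH,0*5C"))  # True
```

In a live feed, sentences failing this check would simply be dropped before any payload decoding or mapping to the DIS interface is attempted.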
Keywords: Maritime safety; AIS; DIS; Distributed simulation Yuan-Hsin Tung, Shian-Shyong Tseng, A novel approach to collaborative testing in a crowdsourcing environment, Journal of Systems and Software, Volume 86, Issue 8, August 2013, Pages 2143-2153, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2013.03.079. (https://www.sciencedirect.com/science/article/pii/S0164121213000782) Abstract: Software testing processes are generally labor-intensive and often involve substantial collaboration among testers, developers, and even users. However, considerable human resource capacity exists on the Internet in social networks, expert communities, or internet forums, referred to as crowds. Effectively using crowd resources to support collaborative testing is an interesting and challenging topic. This paper defines the collaborative testing problem in a crowd environment as an NP-complete job assignment problem and formulates it as an integer linear programming (ILP) problem. Although package tools can be used to obtain the optimal solution to an ILP problem, computational complexity makes these tools unsuitable for solving large-scale problems. This study uses a greedy approach with four heuristic strategies to solve the problem, called the crowdsourcing-based collaborative testing approach. The approach includes two phases: a training phase and a testing phase. The training phase transforms the original problem into an ILP problem; the testing phase solves the ILP using heuristic strategies. A prototype system, called the Collaborative Testing System (COTS), is also implemented. The experiment results show that the proposed heuristic algorithms produce good-quality approximate solutions in an acceptable timeframe.
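The flavor of such a greedy assignment can be illustrated with a minimal sketch (the job list, skill scores, and capacities below are hypothetical, and this is the generic greedy-assignment idea, not the paper's four specific heuristic strategies): each test job goes to the highest-skilled qualified tester who still has capacity.

```python
# Greedy job-assignment sketch: assign each test job to the qualified
# tester with the highest skill score and remaining capacity.
# All data (jobs, testers, scores, capacities) are hypothetical.

def greedy_assign(jobs, testers):
    """jobs: list of (job_id, required_skill);
    testers: dict tester_id -> {"skills": {skill: score}, "capacity": int}."""
    assignment = {}
    for job_id, skill in jobs:
        # Candidates: testers who have the skill and free capacity.
        candidates = [
            (t["skills"][skill], tid)
            for tid, t in testers.items()
            if skill in t["skills"] and t["capacity"] > 0
        ]
        if not candidates:
            continue  # job stays unassigned
        _, best = max(candidates)  # highest score wins
        assignment[job_id] = best
        testers[best]["capacity"] -= 1
    return assignment

jobs = [("j1", "web"), ("j2", "web"), ("j3", "mobile")]
testers = {
    "alice": {"skills": {"web": 0.9, "mobile": 0.4}, "capacity": 1},
    "bob": {"skills": {"web": 0.7}, "capacity": 2},
    "carol": {"skills": {"mobile": 0.8}, "capacity": 1},
}
print(greedy_assign(jobs, testers))  # {'j1': 'alice', 'j2': 'bob', 'j3': 'carol'}
```

An exact ILP solver would optimize the assignment globally; the greedy pass trades optimality for a running time that scales to large crowds.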
Keywords: Crowdsourcing; Cloud computing; Software testing; Collaborative testing; Integer linear programming Boriss Misnevs, Software Engineering Competence Evaluation Portal, Procedia Computer Science, Volume 43, 2015, Pages 11-17, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2014.12.003. (https://www.sciencedirect.com/science/article/pii/S1877050914015713) Abstract: This paper presents the results of the initial research phase of the Software Engineering Competence Evaluation Portal design and implementation. The web portal will be dedicated to synchronizing joint master program training content and supervision between several Baltic universities. The functionality of the portal will provide a common support service for learning outcomes information exchange, referring to a graduate's knowledge, skills and competence upon completion of the Master of Science in Software Engineering (Information Technology) Programs. Keywords: Competence evaluation; Web portal; Knowledge mapping; Testing; SWEBOK Yaning Wu, Song Huang, Haijin Ji, Changyou Zheng, Chengzu Bai, A novel Bayes defect predictor based on information diffusion function, Knowledge-Based Systems, Volume 144, 15 March 2018, Pages 1-8, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2017.12.015. (https://www.sciencedirect.com/science/article/pii/S0950705117305877) Abstract: Software defect prediction plays a significant part in identifying the most defect-prone modules before software testing. Quite a number of researchers have made great efforts to improve prediction accuracy. However, the problem of insufficient historical data available for within- or cross-project prediction still remains unresolved. Further, it is common practice to use the probability density function for a normal distribution in the Naïve Bayes (NB) classifier. Nevertheless, after performing a Kolmogorov–Smirnov test, we find that the 21 main software metrics are not normally distributed at the 5% significance level.
Therefore, this paper proposes a new Bayes classifier, which evolves the NB classifier with a non-normal information diffusion function, to help solve the problem of lacking appropriate training data for new projects. We conduct three experiments on 34 data sets obtained from 10 open source projects, using only 10%, 6.67%, 5%, 3.33% and 2% of the total data for training, respectively. Four well-known classification algorithms are also included for comparison, namely Logistic Regression, Naïve Bayes, Random Tree and Support Vector Machine. All experimental results demonstrate the efficiency and practicability of the new classifier. Keywords: Cross-project defect prediction; Naïve Bayes; Information diffusion function; Software metrics Sun Hong-mei, Jia Rui-sheng, Research on Case Teaching of Software Development Comprehensive Practice Based on Project Driven, Procedia Engineering, Volume 29, 2012, Pages 484-488, ISSN 1877-7058, https://doi.org/10.1016/j.proeng.2011.12.747. (https://www.sciencedirect.com/science/article/pii/S1877705811065842) Abstract: To improve software engineering students’ hands-on development skills, innovation ability, employment competitiveness, and capacity to apply knowledge in an integrated way, and in view of the strongly practical character of software engineering projects, this paper studies the project-driven case teaching method and its application in comprehensive software development practice. It discusses the content, teaching format, and process arrangement of the practical training and, taking .NET technology training as the vehicle, analyzes in detail the implementation of project-driven case teaching, including the content design of the case project and its classroom management.
Practice has shown that project training based on case teaching deepens students’ understanding of the project development process, improves their ability to apply knowledge flexibly to practical problems, strengthens their actual programming skills, and continuously develops their potential and capacity for innovation. Keywords: innovation ability; software development; case teaching; comprehensive practice; project driven J. Alex, U. Jumar, U. Bitter, On-line simulation of wastewater treatment plants, IFAC Proceedings Volumes, Volume 32, Issue 2, July 1999, Pages 5794-5799, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)56989-X. (https://www.sciencedirect.com/science/article/pii/S147466701756989X) Abstract: A software system to support the operation of municipal or industrial wastewater treatment plants is introduced. The demands on and solutions for this system, with respect to control and observer design, software implementation as well as the operator interface, are worked out. Results of a project to develop a model-based operation support system for wastewater treatment plants are presented. Possible applications to train operators, to design software sensors and to enable prognosis are explained. The test case, a full-scale WWTP, is introduced. Finally, a prototype of a technical environment for on-line simulations and remote operation is presented. Keywords: Process control; Model-based control; Water pollution; Observers; Simulation; Software engineering; Remote Control Maricel Medina, Lance Sherry, Michael Feary, Automation for task analysis of next generation air traffic management systems, Transportation Research Part C: Emerging Technologies, Volume 18, Issue 6, December 2010, Pages 921-929, ISSN 0968-090X, https://doi.org/10.1016/j.trc.2010.03.006. (https://www.sciencedirect.com/science/article/pii/S0968090X1000032X) Abstract: The increasing span of control of Air Traffic Control enterprise automation (e.g.
Flight Schedule Monitor, Departure Flow Management), along with lean processes and pay-for-performance business models, has placed increased emphasis on operator training time and error rates. There are two traditional approaches to the design of human–computer interaction (HCI) to minimize training time and reduce error rates: (1) experimental user testing provides the most accurate assessment of training time and error rates, but occurs too late in the development cycle and is cost prohibitive; (2) manual review methods (e.g. cognitive walkthrough) can be used earlier in the development cycle, but suffer from poor accuracy and poor inter-rater reliability. The recent development of “affordable” human performance models provides the basis for the automation of task analysis and HCI design to obtain low-cost, accurate estimates of training time and error rates early in the development cycle. This paper describes a usability/HCI analysis tool that is intended for use by design engineers in the course of their software engineering duties. The tool computes estimates of trials-to-mastery (i.e. time to competence for training) and the probability of failure-to-complete for each task. The HCI required to complete a task on the automation under development is entered into the web-based tool via a form. Assessments of the salience of visual cues to prompt operator actions for the proposed design are used to compute training time and error rates. The web-based tool enables designers in multiple locations to review and contribute to the design. An example analysis is provided along with a discussion of the limitations of the tool and directions for future research.
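The kind of estimate such a tool produces can be illustrated with a toy calculation (the salience-to-probability mapping and independence assumption below are illustrative assumptions, not the paper's actual human performance model): if each step of a task has a success probability derived from the salience of its visual cue, the probability of failure-to-complete is one minus the product of the per-step probabilities.

```python
# Toy probability-of-failure-to-complete estimate from per-step
# cue-salience ratings. The mapping from salience to step success
# probability is an assumed illustration, not the paper's model.

def failure_probability(saliences):
    """saliences: ratings in [0, 1] per task step; higher = more salient cue.
    Assume independent steps; map each rating to a success probability."""
    p_complete = 1.0
    for s in saliences:
        p_step = 0.90 + 0.09 * s  # assumed: 90% base, up to 99% if fully salient
        p_complete *= p_step
    return 1.0 - p_complete

# A 5-step task with mixed cue salience:
print(round(failure_probability([1.0, 0.5, 0.5, 1.0, 0.2]), 3))  # 0.197
```

The point of the toy model is the shape of the result: even modestly weak cues compound across steps, so longer tasks with low-salience cues dominate the predicted error rate.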
Keywords: Human–computer interaction; Usability analysis; Task analysis; Probability of failure-to-complete a task; Trials-to-mastery Wang Xishi, Ning Bin, Liu Yun, Xu Min, Development of a Safe Control System for Train Operation and Research on its Computer Simulation System, IFAC Proceedings Volumes, Volume 29, Issue 1, June–July 1996, Pages 7686-7691, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)58927-2. (https://www.sciencedirect.com/science/article/pii/S1474667017589272) Abstract: Research on a computer simulation system has been conducted in order to develop a new kind of automatic train protection (ATP) system for Chinese Railway. The paper describes the software design of the simulation system, which shares its mathematical model of train braking with the ATP system and consists of more than 20 modules. Based on the data provided by the simulation results, a new kind of ATP system was developed and on-line simulation was conducted. The paper also introduces the features of the ATP system. With a multi-microprocessor structure, the new ATP system adopts speed-distance mode curve control. Service braking can be implemented both for passenger trains and freight trains in the ATP system. The ATP system will become the leading safe control system of train operation on the main lines of Chinese Railway. Keywords: Rail Traffic; Safety; Computer simulation; Transportation control; Automatic process control Amritanshu Agrawal, Wei Fu, Tim Menzies, What is wrong with topic modeling? And how to fix it using search-based software engineering, Information and Software Technology, Volume 98, June 2018, Pages 74-88, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2018.02.005. (https://www.sciencedirect.com/science/article/pii/S0950584917300861) Abstract: Context Topic modeling finds human-readable structures in unstructured textual data. A widely used topic modeling technique is Latent Dirichlet allocation (LDA).
When running on different datasets, LDA suffers from “order effects”, i.e., different topics are generated if the order of training data is shuffled. Such order effects introduce a systematic error into any study. This error can lead to misleading results; specifically, inaccurate topic descriptions and a reduction in the efficacy of text mining classification results. Objective To provide a method in which distributions generated by LDA are more stable and can be used for further analysis. Method We use LDADE, a search-based software engineering tool which uses Differential Evolution (DE) to tune LDA’s parameters. LDADE is evaluated on data from a programmer information exchange site (Stackoverflow), title and abstract text of thousands of Software Engineering (SE) papers, and software defect reports from NASA. Results were collected across different implementations of LDA (Python+Scikit-Learn, Scala+Spark) on the Linux platform and for different kinds of LDAs (VEM, Gibbs sampling). Results were scored via topic stability and text mining classification accuracy. Results In all treatments: (i) standard LDA exhibits very large topic instability; (ii) LDADE’s tunings dramatically reduce cluster instability; (iii) LDADE also leads to improved performance for supervised as well as unsupervised learning. Conclusion Due to topic instability, using standard LDA with its “off-the-shelf” settings should now be deprecated. Also, in future, we should require SE papers that use LDA to test and (if needed) mitigate LDA topic instability. Finally, LDADE is a candidate technology for effectively and efficiently reducing that instability. Keywords: Topic modeling; Stability; LDA; Tuning; Differential evolution Mary Elizabeth “M.E.” Jones, Il-Yeol Song, Dimensional modeling: Identification, classification, and evaluation of patterns, Decision Support Systems, Volume 45, Issue 1, April 2008, Pages 59-76, ISSN 0167-9236, https://doi.org/10.1016/j.dss.2006.12.004.
(https://www.sciencedirect.com/science/article/pii/S0167923606002089) Abstract: Software design is a complex activity. A successful designer requires knowledge and training in specific design techniques combined with practical experience. Designing a dimensional model embodies this challenge. This paper presents Dimensional Design Patterns (DDPs) and their application to the design of dimensional models. We describe a metamodel of the DDPs and show their integration into Kimball's dimensional modeling design process so they can be applied to design problems using a known practice. By providing a metamodel and a method for DDP use, we combine theory and a practical design technique with the goal of increasing the efficiency and effectiveness of the software designer. The experimental results show that classroom use of DDPs increases effectiveness by 25% and efficiency by 9% for students designing dimensional models. This research shows that DDPs could be an effective tool not only for teaching dimensional modeling in academia, but also for designing dimensional models in an industry setting. Keywords: Data warehouse; Dimensional modeling; Data models; Design; Patterns; Software engineering Vahid Garousi, Ahmet Coşkunçay, Aysu Betin-Can, Onur Demirörs, A survey of software engineering practices in Turkey, Journal of Systems and Software, Volume 108, October 2015, Pages 148-177, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2015.06.036. (https://www.sciencedirect.com/science/article/pii/S0164121215001314) Abstract: Understanding the types of software engineering (SE) practices and techniques used in industry is important. There is a wide spectrum in the types and maturity of SE practices conducted in industry. Turkey has a vibrant software industry, and it is important to characterize and understand the state of its SE practices.
Our objective is to characterize and grasp a high-level view of the types of SE practices in the Turkish software industry. To achieve this objective, we systematically designed an online survey with 46 questions based on our past experience in the Canadian and Turkish contexts and using the Software Engineering Body of Knowledge (SWEBOK). Two hundred and two practicing software engineers from the Turkish software industry participated in the survey. The survey results reveal important and interesting findings about SE practices in Turkey and beyond. They also help track the profession of SE, and suggest areas for improved training, education and research. Among the findings are the following: (1) The military and defense software sectors are quite prominent in Turkey, especially in the capital Ankara region, and many SE practitioners work for those companies. (2) 54% of the participants reported not using any software size measurement methods, while 33% mentioned that they have measured lines of code (LOC). (3) In terms of effort, after the development phase (on average, 31% of overall project effort), the software testing, requirements, design and maintenance phases come next and have similar average values (14%, 12%, 12% and 11%, respectively). (4) Respondents experience the most challenge in the requirements phase. (5) Waterfall, a rather old but still widely used lifecycle model, is the model that more than half of the respondents (53%) use. The next most preferred lifecycle models are incremental and Agile/lean development models, with usage rates of 38% and 34%, respectively. (6) The Waterfall and Agile methodologies have a slight negative correlation, denoting that if one is used in a company, the other is less likely to be used. The results of our survey will be of interest to SE professionals both in Turkey and world-wide.
They will also benefit researchers in observing the latest trends in the SE industry and identifying areas of strength and weakness, which would hopefully encourage further industry–academia collaborations in those areas. Keywords: Software engineering; Industry practices; Turkey D. Rodríguez-Gracia, J. Criado, L. Iribarne, N. Padilla, A collaborative testbed web tool for learning model transformation in software engineering education, Computers in Human Behavior, Volume 51, Part B, October 2015, Pages 734-741, ISSN 0747-5632, https://doi.org/10.1016/j.chb.2014.11.096. (https://www.sciencedirect.com/science/article/pii/S0747563214007158) Abstract: Software Engineering provides mechanisms to design, develop, manage and maintain social and collaborative software systems. At present, the Software Engineering Curricula includes teaching Model-Driven Engineering (MDE) as a new paradigm that enables higher productivity, attempting to maximize compatibility between systems. Modern methods for learning MDE require the use of practical approaches to analyze new model-transformation techniques. Model transformations are carried out using very high-level languages, like the ATL language. This model transformation language is built as a plugin for the Eclipse framework, and users who want to collaborate and develop software with it have difficulty executing ATL transformations outside this platform. To handle models at runtime, it is useful to perform the transformations in a standalone way. In this context, we have developed a testbed web tool which aims to be useful for learning model transformation techniques. The tool offers a Graphical User Interface to test and verify the involved model transformations. The proposal is useful as a collaborative scenario for learning MDE and model transformation issues and techniques in Software Engineering education.
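The core idea the tool above teaches, model-to-model (M2M) transformation, can be illustrated outside ATL with a small language-neutral sketch (the source and target metamodels and the Class2Table rule below are hypothetical): every element of a source model is mapped by a rule into an element of a target model, which is the essence of what an ATL rule expresses.

```python
# Minimal M2M transformation sketch in plain Python, not ATL syntax.
# Source metamodel: UML-like classes with attributes.
# Target metamodel: relational tables with columns.

from dataclasses import dataclass, field

@dataclass
class Class_:
    name: str
    attributes: list = field(default_factory=list)  # attribute names

@dataclass
class Table:
    name: str
    columns: list = field(default_factory=list)

def class2table(cls: Class_) -> Table:
    """Transformation rule: one class becomes one table; every
    attribute becomes a column, plus a synthesized primary key."""
    return Table(name=cls.name.lower(), columns=["id"] + list(cls.attributes))

source_model = [Class_("Person", ["name", "age"]), Class_("Course", ["title"])]
target_model = [class2table(c) for c in source_model]
for t in target_model:
    print(t.name, t.columns)  # person ['id', 'name', 'age'] / course ['id', 'title']
```

In ATL the same mapping would be written declaratively as a `rule` with `from` and `to` patterns; the standalone-Python form mirrors what the testbed lets students experiment with outside Eclipse.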
Keywords: MDE; Model transformation; M2M; ATL; EMF; Learning tool Shahid Hussain, Jacky Keung, Arif Ali Khan, Software design patterns classification and selection using text categorization approach, Applied Soft Computing, Volume 58, September 2017, Pages 225-244, ISSN 1568-4946, https://doi.org/10.1016/j.asoc.2017.04.043. (https://www.sciencedirect.com/science/article/pii/S1568494617302259) Abstract: Context Numerous software design patterns have been introduced and cataloged either as a canonical or a variant solution to solve a design problem. Existing automatic techniques for design pattern selection aid novice software developers in selecting the more appropriate design pattern(s) from the list of applicable patterns to solve a design problem in the design phase of the software development life cycle. Goal However, the existing automatic techniques are limited by their reliance on semi-formal specifications, the multi-class problem, the need for an adequate sample size for precise learning, and individual classifier training to determine a candidate design pattern class and suggest the more appropriate pattern(s). Method To address these issues, we exploit a text categorization approach via Fuzzy c-means (an unsupervised learning technique) that presents a systematic way to group similar design patterns and suggest the appropriate design pattern(s) to developers based on the specification of a given design problem. We also propose an evaluation model to assess the effectiveness of the proposed approach in the context of several real design problems and design pattern collections. Subsequently, we propose a new feature selection method, Ensemble-IG, to overcome the multi-class problem and improve the classification performance of the proposed approach. Results The promising experimental results suggest the applicability of the proposed approach in the domain of classification and selection of appropriate design patterns.
Subsequently, we also observed a significant improvement in the learning precision of the proposed approach through Ensemble-IG. Conclusion The proposed approach has four advantages over previous work: first, a semi-formal specification of design patterns is not required as a prerequisite; second, ground-truth class label assignment is not mandatory; third, no separate classifier training is needed for each design pattern class; and fourth, an adequate sample size is not required for precise learning. Keywords: Design patterns; Text categorization; Supervised learning; Unsupervised learning; Feature selection; Design problems Mohammad Adnan Rajib, Venkatesh Merwade, I Luk Kim, Lan Zhao, Carol Song, Shandian Zhe, SWATShare – A web platform for collaborative research and education through online sharing, simulation and visualization of SWAT models, Environmental Modelling & Software, Volume 75, January 2016, Pages 498-512, ISSN 1364-8152, https://doi.org/10.1016/j.envsoft.2015.10.032. (https://www.sciencedirect.com/science/article/pii/S1364815215300906) Abstract: Hydrologic models for a particular watershed or a region are created for addressing a specific research or management problem, and most of the models do not get reused after the project is completed. Similarly, multiple models may exist for a particular geographic location from different researchers or organizations. To avoid the duplication of efforts, and to enable model reuse and enhancement through collaborative efforts, a prototype cyberinfrastructure, called SWATShare, is developed for sharing, execution and visualization of Soil and Water Assessment Tool (SWAT) models. The objective of this paper is to present the software architecture, functional capabilities and implementation of SWATShare as a collaborative environment for hydrology research and education using the models published and shared in the system.
Besides the capability of publishing, sharing, discovery and downloading of SWAT models, some of the functions in SWATShare, such as model calibration, are supported by providing access to high-performance computing resources, including XSEDE and the cloud. Additionally, SWATShare can create dynamic spatial and temporal plots of model outputs at different scales. SWATShare can also be used as an educational tool within a classroom setting for comparing the hydrologic processes under different geographic and climatic settings. The utility of SWATShare for collaborative research and education is demonstrated using three case studies. Even though this paper focuses on the SWAT model, the system’s architecture can be replicated for other models as well. Keywords: Cyberinfrastructure; WaterHUB; SWAT; SWATShare; XSEDE; Hydrology N Papaspyrou, S Retalis, S Efremidis, G Barlas, E Skordalakis, Web-based teaching in software engineering, Advances in Engineering Software, Volume 30, Issue 12, December 1999, Pages 901-906, ISSN 0965-9978, https://doi.org/10.1016/S0965-9978(99)00016-2. (https://www.sciencedirect.com/science/article/pii/S0965997899000162) Abstract: The introduction of the new technologies of computer networks and hypermedia systems in education seems promising. However, only through experimentation can the effectiveness of these technologies be demonstrated. This was the main objective of the EONT project, in the course of which the National Technical University of Athens adapted an introductory course in Software Engineering to a novel enriched instructional delivery mode. The existing course material was supplemented by Web-based courseware, integrated into a novel Web-based networked learning environment. In this article we report on the results of our research and development concerning this particular course, and discuss the results obtained from our evaluation study.
Keywords: Software engineering; Web-based courseware; Enriched instructional delivery mode; Novel learning environment Jun Liao, Meng Joo Er, Jianya Lin, Application of a system for the automatic generation of fuzzy neural networks, Engineering Applications of Artificial Intelligence, Volume 13, Issue 3, 1 June 2000, Pages 293-302, ISSN 0952-1976, https://doi.org/10.1016/S0952-1976(99)00060-3. (https://www.sciencedirect.com/science/article/pii/S0952197699000603) Abstract: To facilitate the transfer of technology emerging from theoretical research on fuzzy neural networks into industrial applications, a system for the automatic generation of fuzzy neural networks (FNNAGS) is proposed in this paper. In FNNAGS, the fuzzy model constructed by the system can be expressed as either a Mamdani model or a Takagi–Sugeno model, according to the preference of the user. Off-line design and on-line applications are incorporated into an interactive software system. In the off-line design stage, only the training data need to be provided in order to construct a process model. Users do not need to give the initial fuzzy partitions, membership functions or fuzzy logic rules. These initial parameters are set up automatically by FNNAGS, in accordance with the properties of the training data. After off-line design has been completed, the model can be expressed as a fuzzy rule base, which can be used to control, estimate, identify or predict a process or plant through an application interface between FNNAGS and the external world. Keywords: Fuzzy neural networks; Software design; Takagi–Sugeno fuzzy model; Mamdani fuzzy model Sarah Manzoor, Raza Ul Islam, Aayman Khalid, Abdul Samad, Jamshed Iqbal, An open-source multi-DOF articulated robotic educational platform for autonomous object manipulation, Robotics and Computer-Integrated Manufacturing, Volume 30, Issue 3, June 2014, Pages 351-362, ISSN 0736-5845, https://doi.org/10.1016/j.rcim.2013.11.003.
(https://www.sciencedirect.com/science/article/pii/S0736584513001002) Abstract: This research presents an autonomous robotic framework for academic, vocational and training purposes. The platform is centred on a 6 Degree Of Freedom (DOF) serial robotic arm. The kinematic and dynamic models of the robot have been derived to facilitate controller design. An on-board camera to scan the arm workspace permits autonomous applications development. The sensory system consists of position feedback from each joint of the robot and a force sensor mounted at the arm gripper. External devices can be interfaced with the platform through digital and analog I/O ports of the robot controller. To enhance the learning outcome for beginners, higher-level commands have been provided. Advanced users can tailor the platform by exploiting the open-source custom-developed hardware and software architectures. The efficacy of the proposed platform has been demonstrated by implementing two experiments: autonomous sorting of objects and controller design. The proposed platform has the potential to teach technical courses (like Robotics, Control, Electronics, Image-processing and Computer vision) and to implement and validate advanced algorithms for object manipulation and grasping, trajectory generation, path planning, etc. It can also be employed in an industrial environment to test various strategies prior to their execution on actual manipulators. Keywords: Educational robotic platform; Industrial robots; Manipulator arm; Robot vision; Autonomous system Boriss Misnevs, Ugur Demiray, The Role of Communication and Meta-communication in Software Engineering with Relation to Human Errors, Procedia Engineering, Volume 178, 2017, Pages 213-222, ISSN 1877-7058, https://doi.org/10.1016/j.proeng.2017.01.100.
(https://www.sciencedirect.com/science/article/pii/S1877705817301005) Abstract: This paper examines and focuses on issues and questions relating to how the meta-communication concept can be used in the Software Engineering process to reduce human errors. The role of IT project communication and the project management tools, which can be regarded as vital for Software Engineering, are investigated. Socio-cognitive modeling of Integrated Software Engineering using the TOGA meta-theory has been discussed. Today the focus is especially on the identification of human and organization decisional errors caused by software developers and managers under high-risk conditions, as evidenced by analyzing reports on failed IT projects. Software Engineer's communication skills are listed. Several types of initial communication situations in decision-making useful for the diagnosis of Software developers’ errors are considered. The developed models can be used for training the IT project management executive staff. Keywords: defect prevention; Socio-cognitive modeling; IT project processes; TOGA meta-theory Lin Chen, Bin Fang, Zhaowei Shang, Yuanyan Tang, Negative samples reduction in cross-company software defects prediction, Information and Software Technology, Volume 62, June 2015, Pages 67-77, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2015.01.014. (https://www.sciencedirect.com/science/article/pii/S0950584915000348) Abstract: Context Software defect prediction has been widely studied based on various machine-learning algorithms. Previous studies usually focus on within-company defects prediction (WCDP), but lack of training data in the early stages of software testing limits the efficiency of WCDP in practice. Thus, recent research has largely examined the cross-company defects prediction (CCDP) as an alternative solution.
Objective However, the gap of different distributions between cross-company (CC) data and within-company (WC) data usually makes it difficult to build a high-quality CCDP model. In this paper, a novel algorithm named Double Transfer Boosting (DTB) is introduced to narrow this gap and improve the performance of CCDP by reducing negative samples in CC data. Method The proposed DTB model integrates two levels of data transfer: first, the data gravitation method reshapes the whole distribution of CC data to fit WC data. Second, the transfer boosting method employs a small ratio of labeled WC data to eliminate negative instances in CC data. Results The empirical evaluation was conducted based on 15 publicly available datasets. CCDP experiment results indicated that the proposed model achieved better overall performance than the compared CCDP models. DTB was also compared to WCDP in two different situations. Statistical analysis suggested that DTB performed significantly better than WCDP models trained by limited samples and produced comparable results to WCDP with sufficient training data. Conclusions DTB reforms the distribution of CC data from different levels to improve the performance of CCDP, and experimental results and analysis demonstrate that it could be an effective model for early software defects detection. Keywords: Cross-company defects prediction; Software fault prediction; Transfer learning Li-Han Chen, Fu-Hau Hsu, Yanling Hwang, Mu-Chun Su, Wei-Shinn Ku, Chi-Hsuan Chang, ARMORY: An automatic security testing tool for buffer overflow defect detection, Computers & Electrical Engineering, Volume 39, Issue 7, October 2013, Pages 2233-2242, ISSN 0045-7906, https://doi.org/10.1016/j.compeleceng.2012.07.005. (https://www.sciencedirect.com/science/article/pii/S0045790612001309) Abstract: Program Buffer Overflow Defects (PBODs) are the stepping stones of Buffer Overflow Attacks (BOAs), which are one of the most dangerous security threats to the Internet.
In this paper, we propose a kernel-based security testing tool, named ARMORY, for software engineers to detect PBODs automatically when they apply all kinds of testing, especially functional testing and unit testing, without increasing the testing workload. Besides, ARMORY does not need any attack instance, any training phase, or source code to finish its security testing. ARMORY can detect unknown PBODs. ARMORY can not only improve software quality, but also reduce the amount of system resources used to protect a system. We implemented ARMORY in the Linux kernel by modifying the sys_read() system call and entry.S, which handles all system calls. Experimental results show that ARMORY can automatically detect PBODs when programmers test the functionality of their programs. Mohamed Sarrab, Mahmoud Elbasir, Saleh Alnaeli, Towards a quality model of technical aspects for mobile learning services: An empirical investigation, Computers in Human Behavior, Volume 55, Part A, February 2016, Pages 100-112, ISSN 0747-5632, https://doi.org/10.1016/j.chb.2015.09.003. (https://www.sciencedirect.com/science/article/pii/S0747563215301345) Abstract: Quality issues are commonly reported following the development of mobile learning applications. To evaluate and increase the chance of the successful development of new mobile learning products, the adoption of a complete and well-defined set of technical quality aspects for mobile learning development and their adoption in the education environment are proposed. This work describes a model that captures most abstract and generic technical aspects of mobile learning service quality, including availability, fast response times, flexibility, scalability, usability, maintainability, functionality, reliability, connectivity, performance, user interface and security.
A set of technical quality aspects was developed following a literature study focussing on standards and guidelines for learning and mobile application software quality. The presented case studies point to a set of contextual technical quality factors that influence the choice of mobile learning application. The findings also indicate that there are causal relationships between learner satisfaction and the overall proposed model technical quality aspects. The model has a positive impact on overall learning process outcomes by evaluating the technical aspects while maintaining the quality of mobile learning delivered. The model components purportedly affect learning outcomes by assessing and improving the acceptability to stakeholders of the technical aspects of mobile learning. Keywords: Mobile learning; Software quality; Technical aspects; Requirements engineering; Human computing interaction Kamal Z. Zamli, Fakhrud Din, Salmi Baharom, Bestoun S. Ahmed, Fuzzy adaptive teaching learning-based optimization strategy for the problem of generating mixed strength t-way test suites, Engineering Applications of Artificial Intelligence, Volume 59, March 2017, Pages 35-50, ISSN 0952-1976, https://doi.org/10.1016/j.engappai.2016.12.014. (https://www.sciencedirect.com/science/article/pii/S095219761630241X) Abstract: The teaching learning-based optimization (TLBO) algorithm has shown competitive performance in solving numerous real-world optimization problems. Nevertheless, this algorithm requires better control for exploitation and exploration to prevent premature convergence (i.e., trapped in local optima), as well as enhance solution diversity. Thus, this paper proposes a new TLBO variant based on Mamdani fuzzy inference system, called ATLBO, to permit adaptive selection of its global and local search operations. In order to assess its performance, we adopt ATLBO for the mixed strength t-way test generation problem.
Experimental results reveal that ATLBO exhibits competitive performance against the original TLBO and other meta-heuristic counterparts. Keywords: Software testing; t-way testing; Teaching learning-based optimization algorithm; Mamdani fuzzy inference system Rayford B Vaughn Jr., Julian E Boggess III, Integration of computer security into the software engineering and computer science programs, Journal of Systems and Software, Volume 49, Issues 2–3, 30 December 1999, Pages 149-153, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(99)00088-6. (https://www.sciencedirect.com/science/article/pii/S0164121299000886) Abstract: This paper presents a role for computer security education in a computer science curriculum and argues that it should become a standard course offering at both the undergraduate and graduate levels of instruction. Computer security instruction requires a fundamental computer science foundation and integrates nicely into the junior or senior year of study. Additionally, a typical computer security overview course tends to reinforce previously taught material – particularly in the areas of networks, operating systems, database, software engineering, computer hardware design/architecture, data communications, and artificial intelligence. It also introduces a wonderful forum from which ethical/societal issues can be discussed and debated. Keywords: Computer security; Curricula; Ethical/societal Chun-Hsi Huang, REU Site: Bio-Grid Initiatives for interdisciplinary research and education, Journal of Parallel and Distributed Computing, Volume 105, July 2017, Pages 174-182, ISSN 0743-7315, https://doi.org/10.1016/j.jpdc.2017.01.012. (https://www.sciencedirect.com/science/article/pii/S0743731517300187) Abstract: Recently, more and more universities have been incorporating HPC (High Performance Computing) in their computing curriculum.
The Bio-Grid REU (Research Experience for Undergraduates) Site offers undergraduate students interested or experienced in HPC a summer research opportunity to participate in projects that apply HPC in various life-science disciplines. The projects are associated with the Bio-Grid Initiatives conducted at the University of Connecticut. Training seminars are designed to equip students with background knowledge such as basic parallel programming, large-scale data analytics, and middleware support, etc., as well as some ongoing projects using these computing methods. Students participate in several collaborative projects supported by a campus-wide computational and data grid. The REU project introduces such interdisciplinary research work to students in the early stage of their academic career to spark their interest. The project aims at preparing future software engineers to formalize and solve emerging life-science problems, as well as life-science researchers with a strong background in high-performance computing. The Bio-Grid REU Site was supported by the National Science Foundation from 08–10 and 12–14, with a website located at http://biogrid.engr.uconn.edu/REU. Keywords: HPC; Computational biology; Grid and cloud computing Sanjay Misra, Ibrahim Akman, Hazan Daglayan, Informatics Related Branch's Curriculum and Role of Project Management, IERI Procedia, Volume 4, 2013, Pages 403-407, ISSN 2212-6678, https://doi.org/10.1016/j.ieri.2013.11.058. (https://www.sciencedirect.com/science/article/pii/S2212667813000610) Abstract: The most important goal of the software industry is to produce successful products. During production, products often fail due to a lack of proper management. This paper explores the role of software engineering courses in computer-engineering-related branches and the reasons why software developers lack proper project management training.
Our findings reflect that in the majority of computer-related branches, such as computer science, computer engineering, and information systems engineering, there is no place for a software project management course. Our findings are based on a survey of course curricula of computer engineering, computer science and information system engineering courses taught in Turkish universities. Keywords: Project Management; Software Industry; Computer Engineering; Computer Science Ryan Armstrong, Sandrine de Ribaupierre, Roy Eagleson, A software system for evaluation and training of spatial reasoning and neuroanatomical knowledge in a virtual environment, Computer Methods and Programs in Biomedicine, Volume 114, Issue 1, April 2014, Pages 29-37, ISSN 0169-2607, https://doi.org/10.1016/j.cmpb.2014.01.006. (https://www.sciencedirect.com/science/article/pii/S0169260714000078) Abstract: This paper describes the design and development of a software tool for the evaluation and training of surgical residents using an interactive, immersive, virtual environment. Our objective was to develop a tool to evaluate user spatial reasoning skills and knowledge in a neuroanatomical context, as well as to augment their performance through interactivity. In the visualization, manually segmented anatomical surface images of MRI scans of the brain were rendered using a stereo display to improve depth cues. A magnetically tracked wand was used as a 3D input device for localization tasks within the brain. The movement of the wand was made to correspond to movement of a spherical cursor within the rendered scene, providing a reference for localization. Users can be tested on their ability to localize structures within the 3D scene, and their ability to place anatomical features at the appropriate locations within the rendering.
Keywords: Spatial reasoning; Neurosurgery; Software architecture; OpenGL; 3D input; Human–computer interfaces Mark van den Brand, Jan Friso Groote, Software engineering: Redundancy is key, Science of Computer Programming, Volume 97, Part 1, 1 January 2015, Pages 75-81, ISSN 0167-6423, https://doi.org/10.1016/j.scico.2013.11.020. (https://www.sciencedirect.com/science/article/pii/S0167642313003043) Abstract: Software engineers are humans and so they make lots of mistakes. Typically, 1 out of 10 to 100 tasks goes wrong. The only way to avoid these mistakes is to introduce redundancy in the software engineering process. This article is a plea to consciously introduce several levels of redundancy for each programming task. Depending on the required level of correctness, expressed in a residual error probability (typically 10^−3 to 10^−10), each programming task must be carried out redundantly 4 to 8 times. This number is hardly influenced by the size of a programming endeavour. Training software engineers does have some effect, as non-trained software engineers require double the number of redundant tasks to deliver software of the desired quality. More compact programming, for instance by using domain specific languages, only reduces the number of redundant tasks by a small constant. Keywords: Software engineering; Software quality; Redundancy N. Sykes, S. Collins, A.B. Loving, V. Ricardo, E. Villedieu, Design for high productivity remote handling, Fusion Engineering and Design, Volume 86, Issues 9–11, October 2011, Pages 1843-1846, ISSN 0920-3796, https://doi.org/10.1016/j.fusengdes.2011.04.004. (https://www.sciencedirect.com/science/article/pii/S092037961100398X) Abstract: As the central part of a programme of enhancements in support of ITER, the Joint European Torus (JET) is being equipped with an all-metal wall.
This enhancement programme requires the removal and installation of 6927 tile carriers and tiles, as well as the removal and installation of embedded diagnostics and antennas. The scale of this operation and the necessity to maximise operational availability of the facility added a requirement for high productivity in the remote activities to the existing exigencies of precision, reliability, cleanliness and operational security. This high productivity requirement has been incorporated into the design of the components and associated installation tooling, the design of the installation equipment, and the development of installation procedures, including the use of a mock-up for optimisation and training. Consideration of the remote handling installation process is vital during the design of the in-vessel components. A number of features meeting the need for high productivity while maintaining the functional requirements have been incorporated into the metal wall components and associated tooling, including kinematic design with guidance appropriate for remote operation. The components and tools are designed to guide the attachment of the installation tool, the installation path, and the interlocking with adjacent components without contact between the fragile castellated beryllium of the adjacent tiles. Other incorporated ergonomic features are discussed. At JET, the remote maintenance is conducted using end effectors, normally bilateral force-feedback manipulators, mounted on driven, articulated booms. Prior to the current shutdown, one long boom was used to conduct the installation and collect and deliver components to the “short” boom [3] which was linked to the tile carrier transfer facility. This led to a loss of efficiency during these movements.
The adoption of a new remote handling philosophy using ‘point of installation’ delivery of components via an additional long boom and a sophisticated logistics system based on ‘task modules’ is described and operational efficiencies detailed. The enabling programs and software behind this new approach needed significant development. Systems such as a task module manager database, image visualisation, virtual reality simulation, operational document system and teach files, which have significantly evolved since their initial inception in 2002, will be elaborated. Keywords: Remote handling; Productivity; Design; Tooling; Equipment; Software Simone S. Borges, Helena Macedo Reis, Leonardo B. Marques, Vinicius H.S. Durelli, Ig Ibert Bittencourt, Patrícia A. Jaques, Seiji Isotani, Reduced GUI for an interactive geometry software: Does it affect students' performance?, Computers in Human Behavior, Volume 54, January 2016, Pages 124-133, ISSN 0747-5632, https://doi.org/10.1016/j.chb.2015.07.064. (https://www.sciencedirect.com/science/article/pii/S0747563215300765) Abstract: Purpose The purpose of this paper is to describe an experimental study to reduce cognitive load and enhance usability for interactive geometry software. Design/methodology/approach The Graphical User Interface is the main mechanism of communication between user and system features. Educational software interfaces should provide useful features to assist learners without generating extra cognitive load. In this context, this research aims at analyzing a reduced and a complete interface of interactive geometry software, and verifies the educational benefits they provide. We investigated whether a reduced interface makes fewer cognitive demands on users in comparison to a complete interface. To this end, we designed the interfaces and carried out an experiment involving 69 undergraduate students.
Findings The experimental results indicate that an interface that hides advanced and extraneous features helps novice users to perform slightly better than novice users using a complete interface. After receiving proper training, however, a complete interface makes users more productive than a reduced interface. Originality/value In educational software, successful user interface designs minimize the cognitive load on users, so that users can direct their efforts to maximizing their understanding of the educational concepts being presented. Keywords: Interactive geometry software; iGeom; Graphical user interface; Experimental study; Interactive learning environment Javier García, Antonio Amescua, María-Isabel Sánchez, Leonardo Bermón, Design guidelines for software processes knowledge repository development, Information and Software Technology, Volume 53, Issue 8, August 2011, Pages 834-850, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2011.03.002. (https://www.sciencedirect.com/science/article/pii/S0950584911000619) Abstract: Context Staff turnover in organizations is an important issue that should be taken into account mainly for two reasons: 1. Employees carry an organization’s knowledge in their heads and take it with them wherever they go; 2. Knowledge accessibility is limited to the amount of knowledge employees want to share. Objective The aim of this work is to provide a set of guidelines to develop knowledge-based Process Asset Libraries (PAL) to store software engineering best practices, implemented as a wiki. Method Fieldwork was carried out in a 2-year training course in agile development. This was validated in two phases (with and without PAL), which were subdivided into two stages: Training and Project.
Results The study demonstrates that, on the one hand, the learning process can be facilitated using PAL to transfer software process knowledge, and on the other hand, products were developed by junior software engineers with a greater degree of independence. Conclusion PAL, as a knowledge repository, helps software engineers to learn about development processes and improves the use of agile processes. Keywords: Software engineering; Software process technology; Knowledge management; Agile development; Web 2.0; Wiki Stefan Wiesner, Sara Nilsson, Klaus-Dieter Thoben, Integrating Requirements Engineering for Different Domains in System Development – Lessons Learnt from Industrial SME Cases, Procedia CIRP, Volume 64, 2017, Pages 351-356, ISSN 2212-8271, https://doi.org/10.1016/j.procir.2017.03.013. (https://www.sciencedirect.com/science/article/pii/S2212827117301579) Abstract: There is a trending transition for companies from offering products to solutions in order to better fulfill customer needs and to reduce environmental impact by e.g. dematerialization. This solution-based development has an associated integration of intelligent devices that contributes to increasing system complexity. The ability of systems engineering processes, methods and tools to cope with these developments is a critical factor for manufacturing companies today. Still, in many cases it is hard to find adequately trained people and sufficiently integrated development tools for complex solutions, especially in the case of small and medium-sized enterprises. Often, the tangible (hardware) part of the solution is primarily developed and the intangible parts (software and services) are added on top. However, the key to successful development is to adapt and integrate all parts according to the requirements set for the solution.
Thus, it is essential to understand how requirements are handled during systems engineering and how they influence the development of the tangible and intangible parts of the solution. The objective of this paper is to study the approach of different industrial use cases for requirements engineering in system development. The aim is to identify how practices from domains like mechanical engineering, software or service engineering can be adapted for an integrated requirements engineering for complex systems, like product-service systems. Keywords: Systems Engineering; Requirements Engineering; Product-Service Systems; Industrial Case Study J.A. McDermid, Skills and Technologies for the Development and Evaluation of Safety Critical Systems, IFAC Proceedings Volumes, Volume 23, Issue 6, October–November 1990, Pages 163-171, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)52195-3. (https://www.sciencedirect.com/science/article/pii/S1474667017521953) Abstract: This paper briefly outlines some of the fundamental problems of producing computer based safety critical systems. It then gives the author's views on likely key developments in safety critical systems technologies over the next decade. Finally, it addresses the skills needed by developers and evaluators of safety critical systems, relating education and training issues to the predicted advances in technology. Keywords: Safety; software development; software engineering; specification languages; program translators; formal languages; redundancy David Lizcano, Javier Soriano, Genoveva López, Javier J. Gutiérrez, Automatic verification and validation wizard in web-centred end-user software engineering, Journal of Systems and Software, Volume 125, March 2017, Pages 47-67, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2016.11.025.
(https://www.sciencedirect.com/science/article/pii/S0164121216302278) Abstract: This paper addresses one of the major web end-user software engineering (WEUSE) challenges, namely, how to verify and validate software products built using a life cycle enacted by end-user programmers. Few end-user development support tools implement an engineering life cycle adapted to the needs of end users. End users do not have the programming knowledge, training or experience to perform development tasks requiring creativity. Elsewhere we published a life cycle adapted to this challenge. With the support of a wizard, end-user programmers follow this life cycle and develop rich internet applications (RIA) to meet specific end-user requirements. However, end-user programmers regard verification and validation activities as being secondary or unnecessary for opportunistic programming tasks. Hence, although the solutions that they develop may satisfy specific requirements, it is impossible to guarantee the quality or the reusability of this software either for this user or for other developments by future end-user programmers. The challenge, then, is to find means of adopting a verification and validation workflow and adding verification and validation activities to the existing WEUSE life cycle. This should not involve users having to make substantial changes to the type of work that they do or to their priorities. In this paper, we set out a verification and validation life cycle supported by a wizard that walks the user through test case-based component, integration and acceptance testing. This wizard is well-aligned with WEUSE's characteristic informality, ambiguity and opportunisticity. Users applying this verification and validation process manage to find bugs and errors that they would otherwise be unable to identify. They also receive instructions for error correction. This assures that their composite applications are of better quality and can be reliably reused.
We also report a user study in which users develop web software with and without a wizard to drive verification and validation. The aim of this user study is to confirm the applicability and effectiveness of our wizard in the verification and validation of a RIA. Keywords: End-user software engineering; Web engineering; Reliability; End-user programming; visual programming; human-computer interaction Regina Colonia-Willner, Self-service systems: new methodology reveals customer real-time actions during merger, Computers in Human Behavior, Volume 20, Issue 2, March 2004, Pages 243-267, ISSN 0747-5632, https://doi.org/10.1016/j.chb.2003.10.017. (https://www.sciencedirect.com/science/article/pii/S074756320300089X) Abstract: Automatic teller machines (ATMs) being developed today may astonish customers. Some of these sophisticated new ATMs deliver full banking services, dispense theater tickets, and cruise the World Wide Web. What may be more striking is the fact that, since the inception of ATMs 30 years ago, bankers, developers, and deployers have not had precise information on how clients perform in real time when using even the simplest ones. Consequently, from the point of view of cognitive research, it is not surprising that, although the banking industry has considerable capital investment tied up in ATMs, banks do not obtain the desired returns. ATMs would need to be utilized more often and for more profitable transactions, which generally means more cognitively demanding ones, in order to bring in profits. To accomplish that mission, designers and bankers need precise information about what customers really do when using the machines. This paper discusses a study conducted with a new methodology that allows the capture of relevant, detailed, and real-time data from electronic self-service systems. That same methodology can be applied to businesses as diverse as airlines, hotels, internet transactions, and many others. 
In our study, the data were collected during the challenging times of a merger of two large international banks that jointly have over 550,000 customers and 1000 ATMs. A sample of 15,099 users, aged 16–79 years, participated. These customers performed 60,259 financial transactions during their 44,435 visits to 103 ATMs in 22 days. Accuracy, as measured by success and errors in obtaining the desired outcome, and time spent with input, measured on-line in milliseconds, were the dependent variables. The data were cross-referenced to identify education, gender, age, experience, and other demographic variables. The implications of the findings for the study of human–computer interaction, practical intelligence, and procedural knowledge are discussed. Keywords: Self-service; Software design; Knowledge transfer; ATM; Human–computer interaction; Practical intelligence; Procedural and tacit knowledge; Cognitive aging Birgit Penzenstadler, Sustainability analysis and ease of learning in artifact-based requirements engineering: The newest member of the family of studies (It’s a girl!), Information and Software Technology, Volume 95, March 2018, Pages 130-146, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2017.11.011. (https://www.sciencedirect.com/science/article/pii/S0950584917303488) Abstract: Context: Artifact-based requirements engineering promises to deliver results of high quality while allowing for flexibility in the development process and the project settings. Tailored for analyzing sustainability, it can offer tangible insights on potential benefits and risks of a system under development. However, as of now there is still relatively little empirical evidence available that would prove this quality, flexibility, and insight potential. Previous studies, specifically on the first two characteristics, differ in their socio-economic contexts and make the findings hard to generalize.
Objective: Our goal is to investigate the advantages and limitations in the application of artifact-based requirements engineering by new, inexperienced requirements engineers to extend our family of studies. In addition, the secondary goal is to evaluate the suitability of the sustainability analysis artifact for a sustainability analysis of the system planned for development. Method: We report on a new member in a family of studies with 20 participants for evaluating artifact models in a sustainability application context. We use a graduate block course as a case. Our data collection is performed via survey at the end of the course, based on the same instrument used in previous studies, and extended with a new section on evaluating the suitability of a particular artifact for sustainability analysis. Results: Both from the quantitative and the qualitative feedback, the results indicate that the students have benefitted from the artifact-based approach to analyzing sustainability in requirements engineering. Usability, syntactic and semantic quality were all rated high and the rationales were positive, as was the feedback on the sustainability analysis artifact. Conclusion: The results contribute to a reliable database on artifact-oriented requirements engineering and strengthen our confidence in the general benefits of artifact-orientation. Relating the old and new data provides some more insight into the trajectory of the wider transfer of artifact-based requirements engineering into practice. Keywords: Sustainability; Requirements engineering; Analysis; Empirical study; Family of studies; Evaluation research; Artifact orientation Jean-Christophe Blaise, Eric Levrat, Benoit Iung, Process approach-based methodology for safe maintenance operation: From concepts to SPRIMI software prototype, Safety Science, Volume 70, December 2014, Pages 99-113, ISSN 0925-7535, https://doi.org/10.1016/j.ssci.2014.05.008.
(https://www.sciencedirect.com/science/article/pii/S0925753514001131) Abstract: Maintenance can be considered today as the main enabling system to sustain a target physical item – a workplace, a piece of work equipment or a means of transport – in a state in which it can perform the required function. Whatever the sector, workers carrying out maintenance activities are exposed to various hazards (e.g. chemical, physical, biological or psychosocial) and may be at risk of developing musculoskeletal disorders, diseases, etc., as well as of suffering occupational accidents (e.g. falls through or off something). Indeed, maintenance can affect the health and safety not only of the workers directly involved in it, but also of other people present in the workplace. To face this maintenance risk issue, risk assessment/management approaches are conventionally conducted by considering human, organisational or technical directions. Nevertheless, such approaches are often not efficient enough because they are too focused on one direction without taking into account all of its interactions with the others. Thus this paper presents a generic integrated risk management approach to maintenance which is based on a generic formalisation of maintenance (intervention) business processes/activities, but also of their requirements more specifically dedicated to health and safety. The approach and its resulting models have then been automated in a software engineering tool called SPRIMI, to be usable for the information, support, training and design of safe maintenance systems. Keywords: Safe maintenance; Occupational safety; Risk management; Process approach; SPRIMI software Todd Sedano, Cécile Péraire, Using Essence Reflection Meetings in Team-based Project Courses, Procedia Computer Science, Volume 62, 2015, Pages 15-16, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2015.08.400.
(https://www.sciencedirect.com/science/article/pii/S1877050915025351) Abstract: Background Many software engineering curricula contain a team-based project course. This is the case for Carnegie Mellon University Silicon Valley's Master of Science in Software Engineering. In this context, we have been using Essence Reflection Meetings for five semesters with 17 teams and approximately 70 students. During these meetings, the teams reflect on various project dimensions based on a systems thinking framework. The positive results have been published in research papers. Activity and Discussions Participants will learn about Essence Reflection Meetings for team-based project courses by practicing in a classroom environment. They will discuss challenges and solutions for team-based project courses, and how the proposed approach could potentially be leveraged in their own teaching environment. Organization We will start the tutorial with a discussion revealing the participants' positive and negative experiences with team-based projects. After briefly introducing Essence's systems thinking framework and our research results, we will use hands-on training exercises to demonstrate how to use the approach. This will be followed by a guided debriefing. Finally, we will go deeper into the Essence framework, and discuss our research results and their applicability in various teaching environments. Learning Objectives By the end of the tutorial, participants will be familiar with a systems thinking framework that they can leverage to coach their student teams and monitor their progress. They will be able to articulate the pros and cons of applying the approach in their own teaching environment. Rupal Patel, Catherine McNab, Displaying prosodic text to enhance expressive oral reading, Speech Communication, Volume 53, Issue 3, March 2011, Pages 431-441, ISSN 0167-6393, https://doi.org/10.1016/j.specom.2010.11.007.
(https://www.sciencedirect.com/science/article/pii/S0167639310002050) Abstract: This study assessed the effectiveness of software designed to facilitate expressive oral reading through text manipulations that convey prosody. The software presented stories in standard (S) and manipulated formats corresponding to variations in fundamental frequency (F), intensity (I), duration (D), and combined cues (C) indicating modulation of pitch, loudness and length, respectively. Ten early readers (mean age = 7.6 years) attended three sessions. During the first session, children read two stories in standard format to establish a baseline. The second session provided training and practice in the manipulated formats. In the third, post-training session, sections of each story were read in each condition (S, F, I, D, C in random order). Recordings were acoustically examined for changes in word duration, peak intensity and peak F0 from baseline to post-training. When provided with pitch cues (F), children increased utterance-wide peak F0 range (mean = 34.5 Hz) and absolute peak F0 for accented words. Pitch cues were more effective in isolation (F) than in combination (C). Although Condition I elicited increased intensity of salient words, Conditions S and D had minimal impact on prosodic variation. Findings suggest that textual manipulations conveying prosody can be readily learned by children to improve reading expressivity. Keywords: Children; Oral reading; Prosody; Reading software; Expressive reading S Saukkonen, Software project management education —experiences of training full-time professionals, Annual Review in Automatic Programming, Volume 16, Part 2, 1992, Pages 139-142, ISSN 0066-4138, https://doi.org/10.1016/0066-4138(92)90022-H. 
(https://www.sciencedirect.com/science/article/pii/006641389290022H) Abstract: The experiences gained from planning and implementing two evolutionary steps in a systematic training programme for young software project managers from a group of companies are presented, and the impacts of the programme on the growth of the participating companies in terms of turnover and number of staff are discussed. The basic ideas of planning were adopted from research carried out into curriculum design for systems analysts and continuing education for software engineers. In our experience, the core topics of the courses should be taken from traditional software project management, together with courses in leadership and the fundamentals of software business and marketing. The course modules should be very short and combined with personal exercises that are closely tied to the day-to-day work of the participants. Commitment of the company to the training scheme can be ensured by using a careful analysis of its software process maturity when selecting the exercises. Pratik Roy, G.S. Mahapatra, K.N. Dey, Neuro-genetic approach on logistic model based software reliability prediction, Expert Systems with Applications, Volume 42, Issue 10, 15 June 2015, Pages 4709-4718, ISSN 0957-4174, https://doi.org/10.1016/j.eswa.2015.01.043. (https://www.sciencedirect.com/science/article/pii/S0957417415000585) Abstract: In this paper, we propose a multi-layer feedforward artificial neural network (ANN) based logistic growth curve model (LGCM) for software reliability estimation and prediction. We develop the ANN by designing different activation functions for the hidden layer neurons of the network. We explain the ANN from the mathematical viewpoint of logistic growth curve modeling for software reliability. We also propose a neuro-genetic approach for the ANN based LGCM by optimizing the weights of the network using a proposed genetic algorithm (GA).
We first train the ANN using the back-propagation algorithm (BPA) to predict software reliability. After that, we use the proposed GA to train the ANN by globally optimizing the weights of the network. The proposed ANN based LGCM is compared with the traditional Non-homogeneous Poisson process (NHPP) based software reliability growth models (SRGMs) and with ANN based software reliability models. We present a comparison between the two training algorithms when they are applied to train the proposed ANN to predict software reliability. The applicability of the different approaches is demonstrated on three real software failure data sets. Experimental results demonstrate that the proposed ANN based LGCM has better fitting and predictive capability than the other NHPP and ANN based software reliability models. It is also noted that when the proposed GA is employed as the learning algorithm for the ANN, the proposed ANN based LGCM gives better fitting and prediction accuracy, i.e., the proposed neuro-genetic approach to the LGCM provides the best predictive validity. The proposed model can be applied during software testing time to obtain better software reliability estimation and prediction than the other traditional NHPP and ANN based software reliability models. Keywords: Artificial neural network; Genetic algorithm; Back-propagation algorithm; Logistic growth curve model; Software reliability; Prediction James E. Tomayko, Lessons learned teaching Ada in the context of software engineering, Journal of Systems and Software, Volume 10, Issue 4, November 1989, Pages 281-283, ISSN 0164-1212, https://doi.org/10.1016/0164-1212(89)90075-7. (https://www.sciencedirect.com/science/article/pii/0164121289900757) Abstract: Educators across the country are struggling with difficult issues in the teaching of Ada and its relationship to the computer science curriculum. By design, the language supports software engineering principles.
Therefore, it would seem that the “natural” place for teaching Ada is within the context of software engineering. This paper reports on the author's and his students' experiences in learning and using Ada in different settings, including a software engineering project course, and a course centered on Ada and its use. Wen-Hsiang Shen, Nien-Lin Hsueh, Wei-Mann Lee, Assessing PSP effect in training disciplined software development: A Plan–Track–Review model, Information and Software Technology, Volume 53, Issue 2, February 2011, Pages 137-148, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2010.09.004. (https://www.sciencedirect.com/science/article/pii/S0950584910001710) Abstract: Context In training disciplined software development, the PSP is said to result in such effects as increased estimation accuracy, better software quality, earlier defect detection, and improved productivity. But a systematic mechanism that can be easily adopted to assess and interpret PSP effect is scarce within the existing literature. Objective The purpose of this study is to explore the possibility of devising a feasible assessment model that ties up critical software engineering values with the pertinent PSP metrics. Method A systematic review of the literature was conducted to establish such an assessment model (which we call a Plan–Track–Review model). Both mean and median approaches, along with a set of simplified procedures, were used to assess the commonly accepted PSP training effects. A set of statistical analyses then followed to increase understanding of the relationships among the PSP metrics and to help interpret the application results. Results Based on the results of this study, the PSP training effect on the controllability, manageability, and reliability of a software engineer is quite positive and largely consistent with the literature. However, its effect on one's ability to predict projects in general (and project size in particular) is not supported as claimed in the literature.
As for one's overall project efficiency, our results show a moderate improvement. Our initial finding also suggests that a prior stage PSP effect could have an impact on later stage training outcomes. Conclusion It is concluded that this Plan–Track–Review model with the associated framework can be used to assess PSP effect regarding disciplined software development. The generated summary report serves to provide useful feedback for both PSP instructors and students based on internal as well as external standards. Keywords: Personal software process (PSP); Software process improvement (SPI); Effect assessment; Plan–Track–Review Eugenio Parra, Christos Dimou, Juan Llorens, Valentín Moreno, Anabel Fraga, A methodology for the classification of quality of requirements using machine learning techniques, Information and Software Technology, Volume 67, November 2015, Pages 180-195, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2015.07.006. (https://www.sciencedirect.com/science/article/pii/S0950584915001299) Abstract: Context One of the most important factors in the development of a software project is the quality of its requirements. Erroneous requirements, if not detected early, may cause many serious problems, such as substantial additional costs, failure to meet the expected objectives and delays in delivery dates. For these reasons, great effort must be devoted in requirements engineering to ensuring that the project's requirements are of high quality. One of the aims of this discipline is the automatic processing of requirements for assessing their quality; this, however, is a complex task because the quality of requirements depends mostly on the interpretation of experts and the necessities and demands of the project at hand. Objective The objective of this paper is to assess the quality of requirements automatically, emulating the assessment that a project's quality expert would perform.
Method The proposed methodology is based on the idea of learning from standard metrics that represent the characteristics that an expert takes into consideration when deciding on the good or bad quality of requirements. Using machine learning techniques, a classifier is trained with requirements previously classified by the expert, and is then used for classifying newly provided requirements. Results We present two representations of the methodology, corresponding to two variants of the problem that depend on how the requirements learning corpus is balanced, and we evaluate both representations in terms of accuracy and efficiency. The paper demonstrates the reliability of the methodology by presenting a case study with requirements provided by the Requirements Working Group of the INCOSE organization. Conclusions A methodology that evaluates the quality of requirements written in natural language is presented in order to emulate the assessment that the expert would provide for new requirements, with an average accuracy of 86.1%. Keywords: Software engineering; Requirements engineering; Requirements quality; Machine learning Cuauhtémoc López-Martín, Predictive accuracy comparison between neural networks and statistical regression for development effort of software projects, Applied Soft Computing, Volume 27, February 2015, Pages 434-449, ISSN 1568-4946, https://doi.org/10.1016/j.asoc.2014.10.033. (https://www.sciencedirect.com/science/article/pii/S1568494614005456) Abstract: To get a better prediction of the costs, schedule, and risks of a software project, it is necessary to have a more accurate prediction of its development effort. Among the main prediction techniques are those based on mathematical models, such as statistical regressions or machine learning (ML).
The ML models applied to predicting development effort have commonly based their conclusions on studies with the following weaknesses: (1) using an accuracy criterion which leads to asymmetry, (2) applying a validation method that causes conclusion instability by randomly selecting the samples for training and testing the models, (3) omitting the explanation of how the parameters for the neural networks were determined, (4) generating conclusions from models that were not trained and tested on mutually exclusive data sets, (5) omitting an analysis of the dependence, variance and normality of data for selecting the suitable statistical test for comparing the accuracies among models, and (6) reporting results without showing a statistically significant difference. In this study, these six issues are addressed when comparing the prediction accuracy of a Radial Basis Function Neural Network (RBFNN) with that of a statistical regression (the model most frequently compared with ML models), of a feedforward multilayer perceptron (MLP, the model most commonly used in the effort prediction of software projects), and of a general regression neural network (GRNN, an RBFNN variant). The hypothesis tested is the following: the accuracy of effort prediction for the RBFNN is statistically better than the accuracy obtained from a simple linear regression (SLR), MLP and GRNN when adjusted function points data, obtained from software projects, is used as the independent variable. Samples obtained from the International Software Benchmarking Standards Group (ISBSG) Release 11 related to new and enhanced projects were used. The models were trained and tested using a leave-one-out cross-validation method. The criteria for evaluating the models were based on Absolute Residuals and on a Friedman statistical test. The results showed that there was a statistically significant difference in accuracy among the four models for new projects, but not for enhanced projects.
Regarding new projects, the accuracy of the RBFNN was better than that of the SLR at the 99% confidence level, whereas the MLP and GRNN were better than the SLR at the 90% confidence level. Keywords: Software development effort prediction; Radial Basis Function Neural Network; Feedforward multilayer perceptron; General regression neural network; Statistical regression; ISBSG data set Wen-Hsiung Wu, Wen-Cheng Yan, Hao-Yun Kao, Wei-Yang Wang, Yen-Chun Jim Wu, Integration of RPG use and ELC foundation to examine students’ learning for practice, Computers in Human Behavior, Volume 55, Part B, February 2016, Pages 1179-1184, ISSN 0747-5632, https://doi.org/10.1016/j.chb.2014.10.023. (https://www.sciencedirect.com/science/article/pii/S0747563214005469) Abstract: Regarding the issue of role-playing games (RPG) and the experiential learning cycle (ELC), the integration of RPG use as a pedagogical and simulation tool for practice and of ELC as a learning theoretical foundation is essential for promoting students' effective learning. However, few studies have applied RPG to simulate practice with the ELC stages, namely, concrete experience (CE), reflective observation (RO), abstract conceptualization (AC) and active experimentation (AE), to examine the learning process and further enhance effective learning outcomes for learners. This study integrates RPG development and use for practice derived from the ELC's four stages, based on practising the project assessment of software development in a software engineering course. The results show a significant improvement in students' learning outcomes after RPG use. More importantly, this study provides the major activities and findings of each ELC stage via RPG use and the mapping of RPG activities with ELC stages. The insightful implications and suggestions of this study are discussed.
Keywords: Role-playing game; Experiential learning theory; Experiential learning cycle; Learning performance; Practice-based learning Ralf Dörner, Paul Grimm, Daniel F. Abawi, Synergies between interactive training simulations and digital storytelling: a component-based framework, Computers & Graphics, Volume 26, Issue 1, February 2002, Pages 45-55, ISSN 0097-8493, https://doi.org/10.1016/S0097-8493(01)00177-7. (https://www.sciencedirect.com/science/article/pii/S0097849301001777) Abstract: A vital requirement for a successful software framework for digital storytelling is that it takes the abilities and background of the story authors into account. Dedicated tools should support authors in expressing their stories within this framework at an adequate level and should define a corresponding authoring process for digital stories. The software framework should provide communication interfaces between technology experts, storytelling experts and application-domain experts. These requirements are similar to the ones already encountered when setting up a framework for interactive training applications. We present a concept of how component and framework methodologies from software engineering, as well as concepts from artificial intelligence, can foster the design of such a software framework. The software architecture of our proposed framework is discussed, as well as the corresponding authoring process and tools. An implementation of our concept is described, and lessons learned while using this framework in the application domain of emergency training are addressed. Although the framework has been applied for training purposes in particular, it can be used as a basis for a digital storytelling framework in general. Paul Grünbacher, Norbert Seyff, Robert O.
Briggs, Hoh Peter In, Hasan Kitapci, Daniel Port, Making every student a winner: The WinWin approach in software engineering education, Journal of Systems and Software, Volume 80, Issue 8, August 2007, Pages 1191-1200, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2006.09.049. (https://www.sciencedirect.com/science/article/pii/S0164121206002937) Abstract: This paper shows how Theory-W and the WinWin requirements negotiation approach are used in software engineering education at several universities in the US, Europe, and Asia. We briefly describe Theory-W, the WinWin negotiation model, available processes, and tool support. We then discuss how students can benefit from WinWin in their software engineering education. We explore different options for teaching the approach and present concrete examples and experiences from the different universities. Keywords: Theory-W; Stakeholder involvement; Requirements engineering Roberto Pérez-Rodríguez, Luis Anido-Rifón, Miguel Gómez-Carballa, Marcos Mouriño-García, Architecture of a concept-based information retrieval system for educational resources, Science of Computer Programming, Volume 129, 1 November 2016, Pages 72-91, ISSN 0167-6423, https://doi.org/10.1016/j.scico.2016.05.005. (https://www.sciencedirect.com/science/article/pii/S0167642316300314) Abstract: Internet searches that occur in learning contexts are very different in nature from traditional “lookup” or “known item” searches: students usually perform searches to gather information about or master a certain topic, and the search engine is used as an aid in the exploration of a domain of knowledge.
This paper presents SDE (Search Discover Explore), an exploratory search engine for educational resources that was built on top of the knowledge provided by Wikipedia: the set of its articles provides the search space (the set of topics that users can investigate), and the relationships between Wikipedia articles inform the suggestions that the search engine provides to students to go deeper in the exploration of a certain domain of knowledge. SDE indexes several hundred thousand educational resources from high-quality Web sources, such as Project Gutenberg and Open Education Europe, among many others. This paper also reports the results of the evaluation of SDE by experts in Technology Enhanced Learning in several workshops that took place across Europe in the context of the European FP7 project iTEC. These results enable us to conclude that the exploratory search paradigm, making use of knowledge mined from Wikipedia, is a very promising approach for building information retrieval systems to be used in learning contexts. Keywords: Exploratory search; Information retrieval; Bag-of-concepts (BoC) representation; Software architecture Bruno de Sousa Monteiro, Alex Sandro Gomes, Francisco Milton Mendes Neto, Youubi: Open software for ubiquitous learning, Computers in Human Behavior, Volume 55, Part B, February 2016, Pages 1145-1164, ISSN 0747-5632, https://doi.org/10.1016/j.chb.2014.09.064. (https://www.sciencedirect.com/science/article/pii/S0747563214005226) Abstract: The popularization of mobile and personalized services motivates the adoption of learning strategies supported by the principles of ubiquitous computing. However, because it is a new field, there is a perceived lack of ubiquitous learning environments based on reference architectures and open-source software.
Against this backdrop, this article presents Youubi, a u-learning environment developed as a component-oriented reference architecture and applied to the context of formal and informal learning. For validation, Youubi was installed and used by undergraduate students and teachers on their smartphones. The method applied in this research includes a design process together with quantitative and qualitative analysis techniques, with the goal of identifying scenarios of ubiquitous learning and capturing the impressions of students and teachers about the playful and motivational aspects and the contribution to learning. Keywords: Ubiquitous learning; Ubiquitous computing; Software engineering; Interactive design; Gamification K.C. Aw, S.Q. Xie, E. Haemmerle, A FPGA-based rapid prototyping approach for teaching of Mechatronics Engineering, Mechatronics, Volume 17, Issue 8, October 2007, Pages 457-461, ISSN 0957-4158, https://doi.org/10.1016/j.mechatronics.2007.05.001. (https://www.sciencedirect.com/science/article/pii/S0957415807000487) Abstract: The degree of Mechatronics Engineering was first introduced in 2002 at the University of Auckland by the Department of Mechanical Engineering. Teaching such a degree requires an integrated approach to topics such as applied mechatronics, mechanical design, control and software engineering. The use of a field programmable gate array (FPGA) with associated software enables students to learn various digital design techniques, simulate them and quickly implement them in the design of mechatronics systems. This paper presents a FPGA-based rapid prototyping platform for teaching applied electronics, hardware description languages and motor control concepts. The platform is based on Altera's University Program UP2 Development Kit.
The FPGA-based approach was chosen to accurately reflect current practice in industry, rather than the more traditional approach of quickly prototyping a digital system with TTL or CMOS chips on a breadboard. Keywords: Rapid prototyping; FPGA; Mechatronics design Giuseppe Scanniello, Ugo Erra, Distributed modeling of use case diagrams with a method based on think-pair-square: Results from two controlled experiments, Journal of Visual Languages & Computing, Volume 25, Issue 4, August 2014, Pages 494-517, ISSN 1045-926X, https://doi.org/10.1016/j.jvlc.2014.03.002. (https://www.sciencedirect.com/science/article/pii/S1045926X14000329) Abstract: Objective: In this paper, we present the results of two controlled experiments conducted to assess a new method based on think-pair-square in the distributed modeling of use case diagrams. Methods: This new method has been implemented within an integrated environment, which allows distributed synchronous modeling and communication among team members. To study the effect of the participants' familiarity with the method and the integrated environment, the second experiment is a replication conducted with the same participants as the original experiment. The results show a significant difference in favor of face-to-face (i.e., the chosen baseline) for the time to complete modeling tasks, with no significant impact on the quality of the produced models. Results: The results on participants' familiarity indicate a significant effect on the task completion time (i.e., more familiar participants spent less time), with no significant impact on quality. Practice: One of the most interesting practical implications of our study is that, when the time difference is not an issue but moving people might be a problem, the new method and environment could represent a viable alternative to face-to-face.
Another significant result is that even people not fully trained in our method and environment may benefit from their use: the training phase could be shortened or skipped. In addition, face-to-face is less prone to consolidating participants' working style and to developing a shared working habit among participants. Implications: This work is in the direction of the media-effect theories applied to requirements engineering. The results indicate that the participants in the experiments spent significantly less time when modeling use case diagrams face-to-face. Conversely, no significant difference was observed in the quality of the artifacts produced by the participants in these tasks. Keywords: Experiments; Distributed Software Development; Requirements Modeling Vahid Garousi, Junji Zhi, A survey of software testing practices in Canada, Journal of Systems and Software, Volume 86, Issue 5, May 2013, Pages 1354-1376, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2012.12.051. (https://www.sciencedirect.com/science/article/pii/S0164121212003561) Abstract: Software testing is an important activity in the software development life-cycle. In an earlier study in 2009, we reported the results of a regional survey of software testing practices among practitioners in the Canadian province of Alberta. To get a larger nationwide view on this topic (across Canada), we conducted a newer survey with a revised list of questions in 2010. Compared to our previous Alberta-wide survey (53 software practitioners), the nation-wide survey had a larger number of participants (246 practitioners). We report the survey design, execution and results in this article. The survey results reveal important and interesting findings about software testing practices in Canada. Whenever possible, we also compare the results of this survey to other similar studies, such as the ones conducted in the US, Sweden and Australia, and also to two previous Alberta-wide surveys, including our 2009 survey.
The results of our survey will be of interest to testing professionals both in Canada and world-wide. They will also benefit researchers in observing the latest trends in the software testing industry and in identifying areas of strength and weakness, which would then hopefully encourage further industry-academia collaborations in this area. Among the findings are the following: (1) the importance of testing-related training is increasing, (2) functional and unit testing are two common test types that receive the most attention and effort, (3) usage of the mutation testing approach is getting attention among Canadian firms, (4) the traditional Test-last Development (TLD) style is still dominant, and a few companies are attempting newer development approaches such as Test-Driven Development (TDD) and Behavior-Driven Development (BDD), (5) in terms of the most popular test tools, NUnit and Web application testing tools overtook JUnit and IBM Rational tools, (6) most Canadian companies use a combination of two coverage metrics: decision (branch) and condition coverage, (7) the number of passing user acceptance tests and the number of defects found per day (week or month) are regarded as the most important quality assurance metrics and decision factors to release, (8) in most Canadian companies, testers are outnumbered by developers, with ratios ranging from 1:2 to 1:5, (9) the majority of Canadian firms spent less than 40% of their efforts (budget and time) on testing during development, and (10) more than 70% of respondents participated in online discussion forums related to testing on a regular basis. Keywords: Survey; Software testing; Industry practices; Canada Victor Berdonosov, Elena Redkolis, TRIZ-fractality of computer-aided software engineering systems, Procedia Engineering, Volume 9, 2011, Pages 199-213, ISSN 1877-7058, https://doi.org/10.1016/j.proeng.2011.03.112.
(https://www.sciencedirect.com/science/article/pii/S1877705811001299) Abstract: The authors of the present paper examined nearly one hundred Computer-Aided Software Engineering (CASE) systems. They propose considering the evolution of CASE systems in the form of a TRIZ-fractal matrix, in which the driving force of evolution is the resolution of contradictions that appeared at the previous stages, with those contradictions resolved using TRIZ tools. The criteria along which CASE systems develop, connected with the TRIZ concept of “ideality”, are practicality and investment. The paper singles out CASE-system development lines, together with their advantages and disadvantages, and analyzes the purpose and development trend of each line. Using this approach for training allows the time needed to learn different CASE systems to be reduced significantly by means of knowledge systematization. This systematization will also make it possible, first, to determine the priorities for the further development of CASE systems; second, to significantly simplify the choice of CASE systems used at enterprises; and third, to validate the application of TRIZ tools for resolving contradictions in CASE systems. Keywords: CASE; Software life cycle; Process-oriented approach; TRIZ-fractality; Systematization of knowledge Orit Hazzan, The reflective practitioner perspective in software engineering education, Journal of Systems and Software, Volume 63, Issue 3, 15 September 2002, Pages 161-171, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(02)00012-2. (https://www.sciencedirect.com/science/article/pii/S0164121202000122) Abstract: This paper focuses on the application of the reflective practitioner (RP) perspective to the profession of software engineering (SE). The RP perspective guides professionals to rethink their professional creations during and after the accomplishment of the creation process.
Analysis of the field of SE supports the adoption of the RP perspective to SE in general and to SE education in particular. The RP perspective emphasizes the studio, the basic training method in architecture schools, as the educational environment for design studies. In such studios, students develop projects under the close guidance of a tutor. Analysis of the kind of tasks that architecture students work on, and a comparison of these tasks with the problems that SE students face, suggests that the studio may be an appropriate teaching method in SE as well. The paper presents the main ideas of the RP perspective and examines its fitness to SE in general and to SE education in particular. The discussion is based on analysis of the RP perspective and of the SE profession, visits to architecture studios, and conversations with tutors in architecture studios and with computing science practitioners.

José M. Chaves-González, Miguel A. Pérez-Toledano, Amparo Navasa, Teaching learning based optimization with Pareto tournament for the multiobjective software requirements selection, Engineering Applications of Artificial Intelligence, Volume 43, August 2015, Pages 89-101, ISSN 0952-1976, https://doi.org/10.1016/j.engappai.2015.04.002. (https://www.sciencedirect.com/science/article/pii/S0952197615000834) Abstract: Software requirements selection is the problem of choosing the set of new requirements to include in the next release of a software package. This NP-hard problem involves several contradictory objectives that software companies have to tackle when developing new releases: projects have to stick to a budget, but they also have to satisfy the highest number of customer requirements. Furthermore, in real instances of the problem, the requirements are subject to interactions and other restrictions which make the problem even harder.
In this paper, a novel multi-objective teaching learning based optimization (TLBO) algorithm is successfully applied to several instances of the problem. To do this, the software requirements selection problem is formulated as a multiobjective optimization problem with two objectives: the total software development cost and the overall customer's satisfaction. In addition, three interaction constraints are also handled. In this context, the original TLBO algorithm has been adapted to solve real instances of the problem generated from data provided by experts. Numerical experiments with case studies on software requirements selection were carried out to demonstrate the effectiveness of the multiobjective proposal. The results show that the developed algorithm performs better than other relevant algorithms previously published in the literature. Keywords: Software requirements selection; Multi-objective evolutionary algorithm; Teaching learning based optimization; Search Based Software Engineering; Next Release Problem; Swarm intelligence

Paul Ralph, The two paradigms of software development research, Science of Computer Programming, Volume 156, 1 May 2018, Pages 68-89, ISSN 0167-6423, https://doi.org/10.1016/j.scico.2018.01.002. (https://www.sciencedirect.com/science/article/pii/S0167642318300030) Abstract: The most profound conflict in software engineering is not between positivist and interpretivist research approaches or Agile and Heavyweight software development methods, but between the Rational and Empirical Design Paradigms. The Rational and Empirical Paradigms are disparate constellations of beliefs about how software is and should be created. The Rational Paradigm remains dominant in software engineering research, standards and curricula despite being contradicted by decades of empirical research.
The Rational Paradigm views analysis, design and programming as separate activities despite empirical research showing that they are simultaneous and inextricably interconnected. The Rational Paradigm views developers as executing plans despite empirical research showing that plans are a weak resource for informing situated action. The Rational Paradigm views success in terms of the Project Triangle (scope, time, cost and quality) despite empirical research showing that the Project Triangle omits critical dimensions of success. The Rational Paradigm assumes that analysts elicit requirements despite empirical research showing that analysts and stakeholders co-construct preferences. The Rational Paradigm views professionals as using software development methods despite empirical research showing that methods are rarely used, very rarely used as intended, and typically weak resources for informing situated action. This article therefore elucidates the Empirical Design Paradigm, an alternative view of software development more consistent with empirical evidence. Embracing the Empirical Paradigm is crucial for retaining scientific legitimacy, solving numerous practical problems and improving software engineering education. Keywords: Empiricism; Rationalism; Philosophy of science; Software design; Empirical software engineering

Iker Gondra, Applying machine learning to software fault-proneness prediction, Journal of Systems and Software, Volume 81, Issue 2, February 2008, Pages 186-195, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2007.05.035. (https://www.sciencedirect.com/science/article/pii/S0164121207001240) Abstract: The importance of software testing to quality assurance cannot be overemphasized. Estimating a module's fault-proneness is important for minimizing cost and improving the effectiveness of the software testing process. Unfortunately, no general technique for estimating software fault-proneness is available.
The observed correlation between some software metrics and fault-proneness has resulted in a variety of predictive models based on multiple metrics. Much work has concentrated on how to select the software metrics that are most likely to indicate fault-proneness. In this paper, we propose the use of machine learning for this purpose. Specifically, given historical data on software metric values and numbers of reported errors, an Artificial Neural Network (ANN) is trained. Then, to determine the importance of each software metric in predicting fault-proneness, a sensitivity analysis is performed on the trained ANN. The software metrics deemed most critical are then used as the basis of an ANN-based predictive model of a continuous measure of fault-proneness. We also view fault-proneness prediction as a binary classification task (i.e., a module can either contain errors or be error-free) and use Support Vector Machines (SVM) as a state-of-the-art classification method. We perform a comparative experimental study of the effectiveness of ANNs and SVMs on a data set obtained from NASA's Metrics Data Program data repository. Keywords: Software testing; Software metrics; Fault-proneness; Machine learning; Neural network; Sensitivity analysis; Support vector machine

M. Zaki, Hany Harb, T.S. Sobh, A learning database system to observe malfunctions and to support network planning, Journal of Systems and Software, Volume 58, Issue 1, 15 August 2001, Pages 33-46, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(01)00026-7. (https://www.sciencedirect.com/science/article/pii/S0164121201000267) Abstract: This paper presents a learning database system that can accommodate malfunction observations. Such observations may be expressed as structured patterns to support network planning, one of the important network management functions.
The underlying system monitors the network protocol tables in order to discover interesting patterns. Two learning techniques are used for this purpose. The first is empirical: it focuses on data samples by selecting specific fields and subsets of records using structured query language (SQL), after which data abstraction is carried out and interesting characteristics are extracted. The second exploits an explanation-based learning (EBL) procedure to obtain operational rules; in this case the domain (network) knowledge is formally expressed and only one training example is analyzed in terms of this knowledge. Thus, the system is capable of discovering various operational patterns, providing sensible advice, and supporting the network planning activity. Since the monitoring database utilizes a relational model, an integrated computer-aided software engineering (I-CASE) tool is used throughout the requirements identification, analysis and design phases, ensuring the quality of the database system as an engineering product. Moreover, the open database connectivity (ODBC) approach is employed to provide an efficient interface that allows a client application to access a variety of distributed data sources in addition to its local database. Keywords: Network management; Database systems; Open database connectivity; Pattern discovery; Abstraction; Explanation-based learning

Kamal Z. Zamli, Basem Y. Alkazemi, Graham Kendall, A Tabu Search hyper-heuristic strategy for t-way test suite generation, Applied Soft Computing, Volume 44, July 2016, Pages 57-74, ISSN 1568-4946, https://doi.org/10.1016/j.asoc.2016.03.021. (https://www.sciencedirect.com/science/article/pii/S1568494616301302) Abstract: This paper proposes a novel hybrid t-way test generation strategy (where t indicates interaction strength), called High Level Hyper-Heuristic (HHH).
HHH adopts Tabu Search as its high-level meta-heuristic and leverages the strengths of four low-level meta-heuristics: Teaching Learning based Optimization, Global Neighborhood Algorithm, Particle Swarm Optimization, and the Cuckoo Search Algorithm. HHH is able to capitalize on the strengths, and limit the deficiencies, of each individual algorithm in a collective and synergistic manner. Unlike existing hyper-heuristics, HHH relies on three defined operators, based on improvement, intensification and diversification, to adaptively select the most suitable meta-heuristic at any particular time. Our results are promising, as HHH manages to outperform existing t-way strategies on many of the benchmarks. Keywords: Software testing; t-way Testing; Hyper-heuristic; Particle Swarm Optimization; Cuckoo Search Algorithm; Teaching Learning based Optimization; Global Neighborhood Algorithm

Ural Erdemir, Feza Buzluca, A learning-based module extraction method for object-oriented systems, Journal of Systems and Software, Volume 97, November 2014, Pages 156-177, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2014.07.038. (https://www.sciencedirect.com/science/article/pii/S0164121214001599) Abstract: Developers apply object-oriented (OO) design principles to produce modular, reusable software. As a result, service-specific groups of related software classes, called modules, arise in OO systems. Extracting these modules is critical for better software comprehension, efficient architecture recovery, determination of service candidates when migrating legacy software to a service-oriented architecture, and transportation of such services to cloud-based distributed systems. In this study, we propose a novel approach to automatic module extraction for identifying services in OO software systems. In our approach, we first create a weighted, directed graph of the software system in which vertices and edges represent the classes and their relations, respectively.
Then, we apply a clustering algorithm over the graph to extract the modules. We calculate the weight of an edge by considering its probability of being within a module or between modules. To estimate these positional probabilities, we propose a machine-learning-based classification system that we train with data gathered from a real-world OO reference system. We have implemented an automatic module extraction tool and evaluated the proposed approach on several open-source and industrial projects. The experimental results show that the proposed approach generates highly accurate decompositions that are close to authoritative module structures and outperforms existing methods. Keywords: Software architecture recovery; Software modularization; SOA

Wee Wee Sim, Peggy S. Brouse, Empowering Requirements Engineering Activities with Personas, Procedia Computer Science, Volume 28, 2014, Pages 237-246, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2014.03.030. (https://www.sciencedirect.com/science/article/pii/S1877050914000933) Abstract: This paper examines the persona concept, which has been used in the Human-Computer Interaction (HCI) field for gathering and communicating information about users, by integrating it into the requirements engineering process and investigating its relationships with the concepts of viewpoints, scenarios, tasks, goals, and requirements in the context of a web application domain. A Concept Development Process (CDP) model is proposed to help guide engineers, analysts, and developers in developing the persona concept and integrating it into the requirements engineering process.
The objectives of the proposed CDP model are (1) to enhance the requirements engineering process by incorporating the persona concept into requirements engineering activities, enabling engineers, analysts, and developers to gain a better understanding of users' needs and behaviors early in the process, and (2) to identify missing requirements early in the process by examining the relationships of personas with scenarios, tasks, goals, and requirements in a web application domain. An online course registration system is used as a case study in the CDP model. Keywords: Persona; Requirements Engineering; Requirements Elicitation; Requirements Analysis; User Modeling

Wilson Rosa, Travis Packard, Abishek Krupanand, James W. Bilbro, Max M. Hodal, COTS integration and estimation for ERP, Journal of Systems and Software, Volume 86, Issue 2, February 2013, Pages 538-550, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2012.09.030. (https://www.sciencedirect.com/science/article/pii/S0164121212002713) Abstract: This paper presents a comprehensive set of effort and schedule estimating models for predicting Enterprise Resource Planning (ERP) implementations, available in the open literature. The first set of models uses product size to predict ERP software engineering effort as well as total integration effort. Product size is measured in terms of the number of report, interface, conversion, and extension (RICE) objects configured and customized within the commercial ERP tool. Total integration effort captures software engineering plus systems engineering, program management, change management, development test & evaluation, and training development. The second set of models predicts the duration of ERP implementation stages in terms of RICE objects, staffing, and the number of test cases.
The statistical models are based on data collected from 20 programs implemented within the federal government over the course of nine years beginning in 2000; the data was collected from 2006 to 2010. The models focus on the vendor's implementation team, and should therefore be applicable to commercial ERP implementations. Finally, ERP adopters and customers can use these models to validate the vendor implementation team's cost proposals or estimates. Keywords: Enterprise Resource Planning; Effort estimation; Cost model; Schedule estimation; Software engineering

Eltjo R. Poort, Hans van Vliet, RCDA: Architecting as a risk- and cost management discipline, Journal of Systems and Software, Volume 85, Issue 9, September 2012, Pages 1995-2013, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2012.03.071. (https://www.sciencedirect.com/science/article/pii/S0164121212000994) Abstract: We propose to view architecting as a risk- and cost management discipline. This point of view helps architects identify the key concerns to address in their decision making, by providing a simple, relatively objective way to assess architectural significance. It also helps business stakeholders to align the architect's activities and results with their own goals. We examine the consequences of this point of view on the architecture process. The point of view is the basis of RCDA, the Risk- and Cost Driven Architecture approach. So far, more than 150 architects have received RCDA training. For a majority of the trainees, RCDA has a significant positive impact on their architecting work. Keywords: Software architecture; Risk Management; Cost management

Todd Sedano, Cécile Péraire, Jason Lohn, Towards Generating Essence Kernels Using Genetic Algorithms, Procedia Computer Science, Volume 62, 2015, Pages 55-64, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2015.08.410.
(https://www.sciencedirect.com/science/article/pii/S1877050915025454) Abstract: The Software Engineering Method and Theory (SEMAT) community created the Essence kernel as a unifying framework for describing and analyzing software engineering endeavors. The Essence kernel is based upon human experience and judgment, not empirical data. Background: At Carnegie Mellon University in Silicon Valley, we have collected data from Master of Science in Software Engineering students as they complete a team-based project course as their capstone or practicum project using the Essence kernel. Each week, the team recorded their progress in an Essence Reflection meeting. This data serves as training data for evaluating the Essence kernel and alternative candidate kernels. Objective: Generate candidate replacement kernels by using a fitness function based on empirical data. Method: Using genetic programming, the kernel genotype is represented as a collection of linear state machines, each with a collection of unique checklist items. Operations to evolve the genotypes include randomly moving checklist items, splitting states, and deleting states by moving their checklist items to other states. Results: Genetic programming created random candidate Essence kernels that achieved higher fitness scores than the original Essence kernel. The purpose of this exploratory work is to demonstrate one way to generate a candidate Essence kernel directly from empirical data, not to recommend a replacement for the original Essence kernel. Reducing the Essence kernel from seven alphas to one alpha results in higher fitness scores. Limitations: Given the limited amount of data, the generated kernels may be over-optimized. Additional empirical data is required before recommending replacing the original kernel with a candidate kernel that fits the data. Conclusion: The original Essence kernel is highly structured around human notions of order.
Genetic algorithms can generate candidate kernels that humans might not normally consider. Based on the analysis of the fitness function, a kernel with a fundamentally different structure might more effectively recommend next steps for a team during Essence Reflection meetings. Keywords: Essence Kernel; Genetic Algorithms; Empirical Research

Thomas B Hilburn, Greg Hislop, Donald J Bagert, Michael Lutz, Susan Mengel, Michael McCracken, Guidance for the development of software engineering education programs, Journal of Systems and Software, Volume 49, Issues 2–3, 30 December 1999, Pages 163-169, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(99)00092-8. (https://www.sciencedirect.com/science/article/pii/S0164121299000928) Abstract: In this paper, we discuss issues and ideas that can improve the undergraduate education of software engineers. We submit that a key impediment to the advancement of software engineering education is the lack of guidance and support for the development of new courses and curricula. We discuss the work and results of a project to create a set of Guidelines for Software Engineering Education. We outline the content of the Guidelines, describe how they relate to recent and current professional activities to improve the practice of software engineering, and discuss future plans for their development. Keywords: Software engineering; Computing curricula; Curriculum design

Nasser Giacaman, Oliver Sinnen, Preparing the software engineer for a modern multi-core world, Journal of Parallel and Distributed Computing, Available online 8 March 2018, ISSN 0743-7315, https://doi.org/10.1016/j.jpdc.2018.02.028. (https://www.sciencedirect.com/science/article/pii/S0743731518301060) Abstract: Parallel and Distributed Computing (PDC) was traditionally viewed as an advanced subject reserved for elective graduate courses.
The last decade has seen rapid growth in two areas whose synergy is demanding new skills from software engineers in a modern multi-core world. The first is society's increasing demand for software engineering solutions, evident in the integral role that software plays in daily life. Unlike traditional PDC applications in the scientific and engineering domains, modern software applications interact directly with millions of users on mainstream laptops, smartphones and tablets. The second is the trend of multi-core processors powering such devices, which further fuels the potential of software applications. This paper proposes a Modern Parallel Programming Framework that recognizes that successful software engineering in this domain involves a combination of hard skills and soft skills. A course dedicated to this goal is presented and evaluated, incorporating a research-infused, problem-based and active learning approach. Keywords: Software engineering; Concurrency; Parallel programming; Active learning; Research-infused learning; Problem-based learning; Soft skills; Graphical user interfaces; Object-oriented programming

Jin Liu, Juan Li, Xiaoping Sun, Yuan Xie, Jeff Lei, Qiping Hu, An Embedded Co-AdaBoost based construction of software document relation coupled resource spaces for cyber–physical society, Future Generation Computer Systems, Volume 32, March 2014, Pages 198-210, ISSN 0167-739X, https://doi.org/10.1016/j.future.2012.12.017. (https://www.sciencedirect.com/science/article/pii/S0167739X12002373) Abstract: Software is a very important means of achieving the vision of the cyber–physical society. Software document relation coupled Resource Spaces promote the cyber–physical society by facilitating the reuse of software design knowledge.
The establishment of software document relation coupled Resource Spaces faces a scarcity of the labeled data needed to discover software document relations between resources residing in different Resource Spaces. This paper proposes the Embedded Co-AdaBoost algorithm to overcome this challenge by making the best use of easily available unlabeled data, integrating multi-view learning into AdaBoost, and leveraging the advantages of Co-training for performance enhancement. Compared with conventional AdaBoost, the experiments illustrate the effectiveness of Embedded Co-AdaBoost in convergence rate, accuracy and stability of performance. The empirical experience demonstrates the ability of Embedded Co-AdaBoost to promote the development of software document relation coupled Resource Spaces. Keywords: Embedded Co-AdaBoost; Software document classification; Software document relation; Coupled resource spaces

James E. Tomayko, Is software engineering graduate-level material?, Journal of Systems and Software, Volume 10, Issue 4, November 1989, Pages 231-233, ISSN 0164-1212, https://doi.org/10.1016/0164-1212(89)90068-X. (https://www.sciencedirect.com/science/article/pii/016412128990068X) Abstract: Since the first software engineering degree programs began to appear in the late 1970s, the subject has largely been taught at the graduate level, with an increasing number of one- and two-term introductory courses offered for upper-division undergraduate students. The last curriculum suggested by the Association for Computing Machinery in 1978 for undergraduate computer science degree programs designates such a course as an elective, and most degrees are granted to students who have not taken it.
Therefore, many people employed today as de facto software engineers gain their knowledge of software engineering through trial and error on the job, perhaps coupled with part-time graduate study if they are lucky enough to work near a university offering such a program. Since so few schools explicitly structure their undergraduate courses to incorporate software engineering principles and techniques, software engineering exists at the graduate level essentially by default: courses taken after a bachelor's degree need to be at the graduate level for students to receive proper credit.

Gillian R. Hayes, Charlotte P. Lee, Paul Dourish, Organizational routines, innovation, and flexibility: The application of narrative networks to dynamic workflow, International Journal of Medical Informatics, Volume 80, Issue 8, August 2011, Pages e161-e177, ISSN 1386-5056, https://doi.org/10.1016/j.ijmedinf.2011.01.005. (https://www.sciencedirect.com/science/article/pii/S1386505611000281) Abstract: Objective: The purpose of this paper is to demonstrate how current visual representations of organizational and technological processes do not fully account for the variability present in everyday practices. We further demonstrate how narrative networks can augment these representations to indicate potential areas for successful or problematic adoption of new technologies and potential needs for additional training. Methods: We conducted a qualitative study of the processes and routines at a major academic medical center slated to be supported by the development and installation of a new comprehensive HIT system. We used qualitative data collection techniques including observations of the activities to be supported by the new system and interviews with department heads, researchers, and both clinical and non-clinical staff.
We conducted a narrative network analysis of these data by choosing exemplar processes to be modeled, selecting and analyzing narrative fragments, and developing visual representations of the interconnection of these narratives. Results: Narrative networks enable us to view the variety of ways work has been and can be performed in practice, informing our ability to design for innovation in use. Discussion: Narrative networks are a means for analyzing and visualizing organizational routines in concert with more traditional requirements engineering, workflow modeling, and quality improvement outcome measurement. This type of analysis can support a deeper and more nuanced understanding of how and why certain routines continue to exist, change, or stop entirely. At the same time, it can illuminate areas in which adoption may be slow, more training or communication may be needed, or routines preferred by the leadership are subverted by routines preferred by the staff. Keywords: Collaborative systems; Workflow analysis; Narrative networks

O. Akanyeti, T. Kyriacou, U. Nehmzow, R. Iglesias, S.A. Billings, Visual task identification and characterization using polynomial models, Robotics and Autonomous Systems, Volume 55, Issue 9, 30 September 2007, Pages 711-719, ISSN 0921-8890, https://doi.org/10.1016/j.robot.2007.05.016. (https://www.sciencedirect.com/science/article/pii/S0921889007000632) Abstract: Developing robust and reliable control code for autonomous mobile robots is difficult, because the interaction between a physical robot and the environment is highly complex, subject to noise and variation, and therefore partly unpredictable. This means that to date it is not possible to predict robot behaviour based on theoretical models. Instead, current methods for developing robot control code still require a substantial trial-and-error component in the software design process.
This paper proposes a method of dealing with these issues by (a) establishing task-achieving sensor-motor couplings through robot training, and (b) representing these couplings through transparent mathematical functions that can be used to form hypotheses and theoretical analyses of robot behaviour. We demonstrate the viability of this approach by teaching a mobile robot to track a moving football and subsequently modelling this task using the NARMAX system identification technique. Keywords: Autonomous mobile robots; System identification; Polynomials

Donald J. Bagert, Susan A. Mengel, Developing and using a web-based project process throughout the software engineering curriculum, Journal of Systems and Software, Volume 74, Issue 2, 15 January 2005, Pages 113-120, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2003.09.024. (https://www.sciencedirect.com/science/article/pii/S0164121203002917) Abstract: In order to facilitate the study and use of software process, which is essential to the education of future software professionals, a standard, tailorable project process has been developed over the last five years at Texas Tech University for use in both undergraduate and graduate curricula, with a total of 12 courses involved. The process is entirely web-based and includes a complete set of HTML document templates to facilitate the creation of project artifacts, which are posted to the course web page. This method enhances communication between team members, including distance education students, and between the project team and the client. The project process has received positive feedback from all stakeholders involved. This paper discusses the benefits of the web-based project process, its relation to curriculum models, and plans for a more formal assessment of the process. The portability of the process to other institutions is also discussed, with an example involving the software engineering courses at the Rose-Hulman Institute of Technology.
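The "transparent mathematical functions" in the Akanyeti et al. abstract above are polynomial input-output models fitted to training data. As a toy illustration only (not the authors' NARMAX implementation; the model order, function names and data here are invented), a quadratic sensor-to-motor mapping can be recovered from samples by ordinary least squares:

```python
# Minimal sketch: fit y ~ a0 + a1*u + a2*u^2 to training samples,
# in the spirit of polynomial system identification. Pure stdlib;
# all names and the synthetic "training run" are illustrative.

def design_matrix(u):
    """Polynomial terms [1, u, u^2] for each input sample u(t)."""
    return [[1.0, x, x * x] for x in u]

def solve_3x3(a, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    m = [row[:] + [bv] for row, bv in zip(a, b)]
    n = 3
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def fit_polynomial_model(u, y):
    """Least-squares fit via the normal equations (X^T X) w = X^T y."""
    X = design_matrix(u)
    A = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(3)]
         for i in range(3)]
    b = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(3)]
    return solve_3x3(A, b)

# Synthetic "training run": a known quadratic sensor-motor coupling.
inputs = [i / 10.0 for i in range(-20, 21)]
outputs = [0.5 + 1.5 * x - 0.8 * x * x for x in inputs]
coeffs = fit_polynomial_model(inputs, outputs)
print([round(c, 3) for c in coeffs])  # recovers [0.5, 1.5, -0.8]
```

The appeal the abstract points at is exactly this transparency: the fitted coefficients are inspectable numbers, so hypotheses about robot behaviour can be formed and tested analytically rather than by black-box trial and error.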
Benjamin Turnbull, Suneel Randhawa, Automated event and social network extraction from digital evidence sources with ontological mapping, Digital Investigation, Volume 13, June 2015, Pages 94-106, ISSN 1742-2876, https://doi.org/10.1016/j.diin.2015.04.004. (https://www.sciencedirect.com/science/article/pii/S1742287615000444) Abstract: The sharp rise in consumer computing, electronic and mobile devices, and data volumes has resulted in increased workloads for digital forensic investigators and analysts. The number of crimes involving electronic devices is increasing, as is the amount of data per case. This is becoming unscalable, and alternative methods are needed to reduce the time trained analysts spend on each case. This work leverages standardised knowledge representation techniques and automated rule-based systems to encapsulate expert knowledge about forensic data. The implementation of this research can provide high-level analysis based on low-level digital artefacts in a way that makes clear which facts support each decision. Analysts can quickly determine which artefacts warrant further investigation and create high-level case data without building it manually from the low-level artefacts. Extraction and understanding of users and social networks, and translating the state of file systems into sequences of events, are the first uses for this work. A major goal of this work is to automatically derive 'events' from the base forensic artefacts. Events may be system events, representing logins, start-ups and shutdowns, or user events, such as web browsing or sending email. The same information fusion and homogenisation techniques are used to reconstruct social networks. There can be numerous social network data sources on a single computer: internet caches can reveal Facebook, LinkedIn and Google Plus activity; email has address books and copies of emails sent and received; instant messengers have friend lists and call histories.
Fusing these into a single graph allows a more complete, less fractured view for an investigator. Both event creation and social network creation are expected to assist investigator-led triage and other fast forensic analysis situations. Keywords: Artificial intelligence; Big data; Digital forensics; Digital evidence; Event representation; Forensic tool development; Knowledge representation; Ontology; Software engineering; Triage Susan Ferreira, Misagh Faezipour, An Analysis of Processes, Risks, and Best Practices for Use in Developing Systems Engineering Process Simulators, Procedia Computer Science, Volume 8, 2012, Pages 87-92, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2012.01.018. (https://www.sciencedirect.com/science/article/pii/S1877050912000191) Abstract: Systems engineering simulation models provide a valuable means to understand and learn important concepts related to systems engineering. Simulators can be used for education and as decision support systems in order to evaluate the dynamic consequences of various courses of action. This paper examines existing research mined from literature related to systems and software engineering processes, risks, and best practices. It seeks to guide the definition of priorities for systems engineering process simulator development. The paper identifies an initial set of research opportunities for the development of systems engineering process simulators. Keywords: Systems Engineering Simulator; Process; Risk; Best Practices; Systems Engineering Process Simulation Christopher J. Vincent, Ann Blandford, Usability standards meet scenario-based design: Challenges and opportunities, Journal of Biomedical Informatics, Volume 53, February 2015, Pages 243-250, ISSN 1532-0464, https://doi.org/10.1016/j.jbi.2014.11.008. 
(https://www.sciencedirect.com/science/article/pii/S153204641400238X) Abstract: The focus of this paper is on the challenges and opportunities presented by developing scenarios of use for interactive medical devices. Scenarios are integral to the international standard for usability engineering of medical devices (IEC 62366:2007), and are also applied to the development of health software (draft standard IEC 82304-1). The 62366 standard lays out a process for mitigating risk during normal use (i.e. use as per the instructions, or accepted medical practice). However, this begs the question of whether “real use” (that which occurs in practice) matches “normal use”. In this paper, we present an overview of the product lifecycle and how it impacts on the type of scenario that can be practically applied. We report on the development and testing of a set of scenarios intended to inform the design of infusion pumps based on “real use”. The scenarios were validated by researchers and practitioners experienced in clinical practice, and their utility was assessed by developers and practitioners representing different stages of the product lifecycle. These evaluations highlighted previously unreported challenges and opportunities for the use of scenarios in this context. Challenges include: integrating scenario-based design with usability engineering practice; covering the breadth of uses of infusion devices; and managing contradictory evidence. Opportunities included scenario use beyond design to guide marketing, to inform purchasing and as resources for training staff. This study exemplifies one empirically grounded approach to communicating and negotiating the realities of practice. 
Keywords: User-Computer interface; Multidisciplinary communication; Medical device design; Software design Perry Alexander, Integrating formalism into undergraduate software engineering, Journal of Systems and Software, Volume 74, Issue 2, 15 January 2005, Pages 147-154, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2003.09.027. (https://www.sciencedirect.com/science/article/pii/S0164121203002942) Abstract: This paper describes an approach and rationale for using logic and formal methods in undergraduate software engineering education. Formal methods and logic provide a mathematical basis for modeling software analogous to the role of continuous mathematics in traditional engineering disciplines. Traditional software engineering techniques provide means for modeling software development processes and structuring specifications. Neither formal methods nor traditional approaches subsume the other, but are complementary in software engineering education and practice. The course described here was a part of the standard Computer Engineering curriculum at The University of Cincinnati from 1993 through 1999. This paper reports on the course and observations over six years of teaching the course to undergraduate and graduate students. James H. Gerlach, Feng-Yang Kuo, Formal development of hybrid user-computer interfaces with advanced forms of user assistance, Journal of Systems and Software, Volume 16, Issue 3, November 1991, Pages 169-183, ISSN 0164-1212, https://doi.org/10.1016/0164-1212(91)90012-U. (https://www.sciencedirect.com/science/article/pii/016412129190012U) Abstract: This article presents a software engineering approach to developing a hybrid interface consisting of both command language and direct manipulation. This approach supports development of reusable and maintainable interface software. In addition, it includes provisions for satisfying human factors. 
These provisions include multiple interface styles that are functionally equivalent, training wheels for error containment, and multithread dialogs for user advisories. Details of the interface architecture are presented along with a prototype of an operating system. Nauman Bin Ali, Kai Petersen, Claes Wohlin, A systematic literature review on the industrial use of software process simulation, Journal of Systems and Software, Volume 97, November 2014, Pages 65-85, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2014.06.059. (https://www.sciencedirect.com/science/article/pii/S0164121214001502) Abstract: Context Software process simulation modelling (SPSM) captures the dynamic behaviour and uncertainty in the software process. Existing literature has conflicting claims about its practical usefulness: SPSM is useful and has an industrial impact; SPSM is useful and has no industrial impact yet; SPSM is not useful and has little potential for industry. Objective To assess the conflicting standpoints on the usefulness of SPSM. Method A systematic literature review was performed to identify, assess and aggregate empirical evidence on the usefulness of SPSM. Results In the primary studies, to date, the persistent trend is that of proof-of-concept applications of software process simulation for various purposes (e.g. estimation, training, process improvement, etc.). They score poorly on the stated quality criteria. Also only a few studies report some initial evaluation of the simulation models for the intended purposes. Conclusion There is a lack of conclusive evidence to substantiate the claimed usefulness of SPSM for any of the intended purposes. A few studies that report the cost of applying simulation do not support the claim that it is an inexpensive method. Furthermore, there is a paramount need for improvement in conducting and reporting simulation studies with an emphasis on evaluation against the intended purpose. 
Keywords: Software process simulation; Systematic literature review; Evidence based software engineering M Karpenko, N Sepehri, Neural network classifiers applied to condition monitoring of a pneumatic process valve actuator, Engineering Applications of Artificial Intelligence, Volume 15, Issues 3–4, June–August 2002, Pages 273-283, ISSN 0952-1976, https://doi.org/10.1016/S0952-1976(02)00068-4. (https://www.sciencedirect.com/science/article/pii/S0952197602000684) Abstract: As modern process plants become more complex, the ability to detect and identify the faulty operation of pneumatic control valves is becoming increasingly important. In this work, a neural network pattern classifier is employed to carry out fault diagnosis and identification upon the actuator of a Fisher–Rosemount 667 industrial process valve. The network is trained with experimental data obtained directly from a software package that comes with the valve. This has eliminated the need for additional instrumentation of the valve. Using this software, tests are carried out to obtain experimental parameters associated with the valve performance for incorrect supply pressure, diaphragm leakage, and vent blockage faults. Specifically, the valve signature and dynamic error band tests are used to directly obtain lower and upper bench sets, minimum, maximum, and average dynamic errors, as well as the dynamic linearity. Additionally, valve deadband and hysteresis are measured graphically from the available valve signature plots for each faulty condition. The relationships between these parameters, for each fault, form signatures that are subsequently learned by a multilayer feedforward network trained by error back-propagation. The test results show that the resulting network has the ability to detect and identify various magnitudes of each fault. It is also observed that a smaller network with a shorter training time results when the valve deadband and hysteresis are included in the training data. 
Thus, the extra effort required to extract these parameters from the valve signature plots is justified. Keywords: Condition monitoring; Fault detection and identification; Fault diagnosis; Pneumatic process valves; Neural networks; Pattern classifiers Tobias Teich, Falko Roessler, Daniel Kretz, Susan Franke, Design of a Prototype Neural Network for Smart Homes and Energy Efficiency, Procedia Engineering, Volume 69, 2014, Pages 603-608, ISSN 1877-7058, https://doi.org/10.1016/j.proeng.2014.03.032. (https://www.sciencedirect.com/science/article/pii/S1877705814002781) Abstract: As a part of smart homes, a subsystem consisting of three components including a neural network is designed to provide personalized services. Unique factor combinations of building specifics, user profiles and external influences lead to the necessity of self-adaptive systems for personal comfort. The system supports room temperature control in order to heat rooms energy-efficiently at a set time. Smart home systems require a software architecture that allows services to be deployed on virtual and hardware devices. The design of automated processes is the first step of later programming and implementation into smart home systems that will automatically supervise and re-train its components and will also allow live feedback. Keywords: Neural networks; learning systems; energy efficiency; smart homes Ricardo de A. Araújo, Adriano L.I. Oliveira, Sergio Soares, A shift-invariant morphological system for software development cost estimation, Expert Systems with Applications, Volume 38, Issue 4, April 2011, Pages 4162-4168, ISSN 0957-4174, https://doi.org/10.1016/j.eswa.2010.09.078. (https://www.sciencedirect.com/science/article/pii/S0957417410010353) Abstract: This work presents a shift-invariant morphological system to solve the problem of software development cost estimation (SDCE). 
It consists of a hybrid morphological model, which is a linear combination between a morphological-rank (MR) operator (nonlinear) and a Finite Impulse Response (FIR) operator (linear), referred to as morphological-rank-linear (MRL) filter. A gradient steepest descent method to adjust the MRL filter parameters (learning process), using the Least Mean Squares (LMS) algorithm, and a systematic approach to overcome the problem of non-differentiability of the morphological-rank operator are used to improve the numerical robustness of the training algorithm. Furthermore, an experimental analysis is conducted with the proposed system using the NASA software project database, and in the experiments, two relevant performance metrics and an evaluation function are used to assess its performance. The results obtained are compared to models recently presented in literature, showing superior performance of this kind of morphological systems for the SDCE problem. Keywords: Software development cost estimation; Software effort prediction; Software engineering; Mathematical morphology; Shift-invariant morphological systems; Morphological-rank-linear models L.A. Gilbert, Microelectronics in education: two types of innovation, two strategies, International Journal of Man-Machine Studies, Volume 17, Issue 1, July 1982, Pages 3-14, ISSN 0020-7373, https://doi.org/10.1016/S0020-7373(82)80003-5. (https://www.sciencedirect.com/science/article/pii/S0020737382800035) Abstract: The extent to which education has adopted and made effective use of technology is not commensurate with the prospective benefits. Mounting evidence indicates that good hardware and software design and implementation in the technical sense are unlikely in themselves to ensure successful and continued use. Other factors have been identified as relevant to technological innovation. 
Some of these, in such fields as social psychology and sociology, may require actions outside the immediate scope of the technologist, concerned with such issues as curriculum development, teacher training, information and advice, and resource support. The United Kingdom and France, for example, have embarked upon fundamentally similar national strategies for the introduction of microelectronic technology into their educational systems. Success may depend upon the extent to which educational institutions are capable of integrating information technology, based upon microelectronics, into processes which have hitherto been dominated by the use of print for historical reasons. Adult and informal learners are increasingly using information technology outside traditional educational environments, and an extensive growth of open and distance learning networks can be foreseen, combining educational institutions with other facilities and services. A second type of innovative strategy will therefore be required, concerned largely with the development of technological systems that are suitable for use by independent learners. This calls for agencies that can identify user needs and mediate them to providers of systems which may at the moment be oriented towards non-educational users. John Fox, Nicky Johns, Ali Rahmanzadeh, Disseminating medical knowledge: the PROforma approach, Artificial Intelligence in Medicine, Volume 14, Issues 1–2, September–October 1998, Pages 157-182, ISSN 0933-3657, https://doi.org/10.1016/S0933-3657(98)00021-9. (https://www.sciencedirect.com/science/article/pii/S0933365798000219) Abstract: Medical knowledge is traditionally disseminated via the publication of documents and through participation in clinical practice. Information technology offers to extend both modes of dissemination, via electronic publishing and virtual reality training, for example. 
AI promises even more radical changes through the possibility of publishing clinical expertise in the form of expert systems, which assist patient care through active decision support and workflow management. PROforma is a knowledge representation language that is designed to support this new mode of dissemination. It is based on an intuitive model of the processes of care and well-understood logical semantics. This paper provides a description of the language and associated software tools, and discusses its potential roles in, and implications for, medical knowledge publishing. Keywords: Medicine; Knowledge representation; Proformalisation; Software engineering; Formal methods; Decision making; Planning; Uncertainty Charles Ume, Marc Timmerman, George Graham, Dan Ezenekwe, Microprocessor design: An integrated multidisciplinary approach, Mechatronics, Volume 3, Issue 1, February 1993, Pages 77-87, ISSN 0957-4158, https://doi.org/10.1016/0957-4158(93)90039-5. (https://www.sciencedirect.com/science/article/pii/0957415893900395) Abstract: Traditional manufacturing-oriented education in the field of mechanical engineering includes topics like moulding, casting, metallurgy, special materials and machining. For a mechanical engineering graduate to be prepared to face the modern computerized assembly line or to design a microprocessor based product, microprocessor education is essential. This paper presents a brief summary of the microprocessor graduate and undergraduate laboratories and lectures as taught at the George W. Woodruff School of Mechanical Engineering of the Georgia Institute of Technology. The approach taken emphasizes an integrated approach in which hardware design, software design, and traditional mechanical engineering topics are combined in the instructional program. Four final projects drawn from the class archives illustrate this integrated approach to engineering design and manufacturing education. Wasfi G. 
Al-Khatib, Omran Bukhres, Patricia Douglas, An empirical study of skills assessment for software practitioners, Information Sciences - Applications, Volume 4, Issue 2, September 1995, Pages 83-118, ISSN 1069-0115, https://doi.org/10.1016/1069-0115(95)90014-4. (https://www.sciencedirect.com/science/article/pii/1069011595900144) Abstract: Software professionals face the difficult challenge of keeping up with today's fast-paced technological environment. There has been much discussion about technical obsolescence in a field where the half-life of an undergraduate education is only a few years. Moreover, assessments provide measurable proof of behavioral changes, legitimizing the human resource department's role in improving productivity by rendering it quantifiable. In this paper, we describe an empirical study of the skills assessment of software practitioners. This study is based on a survey performed collaboratively by the Software Engineering Research Center (SERC), Purdue University, and IBM Training and Education, with direct participation from the IEEE. The goal of this research was the determination of the critical skills necessary for software professionals. This paper describes the survey, the structure of the questionnaire, and the skills assessment process. Skills assessment stages such as data collection, data analysis, data representation, and follow-up reassessment are also described. Detailed results of the survey and selected critical skills relating to both object-oriented and client-server technologies are presented in this paper. These assessments provide a systematic approach through which human resources departments can improve productivity during downturns by increasing the working effectiveness of software developers. We believe that university software engineering students must understand the differences between academic programming and industry software development and engineering. 
They must also be able to perform the activities involved with plan development, project management, and software product evaluation. We also conclude that these assessments will foster genuine commitment and motivate software practitioners to grow in a field of technology that changes daily. Giovanny Arbelaez Garces, Auguste Rakotondranaivo, Eric Bonjour, An acceptability estimation and analysis methodology based on Bayesian networks, International Journal of Industrial Ergonomics, Volume 53, May 2016, Pages 245-256, ISSN 0169-8141, https://doi.org/10.1016/j.ergon.2016.02.005. (https://www.sciencedirect.com/science/article/pii/S0169814116300099) Abstract: As companies are forced to conceive innovative products to stay competitive, designers face the challenge of developing products more suited to users' needs and perceptions in order to be accepted, thus reducing project risk failure. Evaluating users' acceptability has become an important research problem. Current approaches leave the acceptance evaluation question to be answered in the last stages of the product development process (NPD), when an almost finished prototype is available and when there is no time left for important modifications. Acceptability evaluation methods suitable for use from the early stages of the NPD process are thus needed. This paper proposes a method for acceptability evaluation and analysis that can be used in the early stages of the development cycle. It is based on the evaluation of the solution concept by the users. The relationships among the factors (or criteria) are made explicit, thus helping designers to identify the key factors for acceptance. As the users' tests and the maturity of the concept prototype are limited in this stage, the proposed method exploits the inference properties of Bayesian networks making it possible to make useful estimations and allowing the exploration of actions that could improve the product acceptability level. 
Two case studies are presented in order to illustrate the method, the first related to a technological product design for a home-health care service provider and the second to a work-related musculoskeletal disorder prevention software design. Relevance to industry The article describes an acceptability assessment and an analysis approach to be used by industrial engineers, designers and ergonomists in the early phases of design projects. The method can help the design team to identify the levers (key factors) for enhancing product acceptance and to identify different actions (e.g. product modification, deployment strategy, and training). Keywords: Acceptability evaluation; Product design; Innovation; Bayesian networks; User preferences Daniel E. Schaerer, Helmut Schauer, List and graph algorithms in Object Pascal, Journal of Microcomputer Applications, Volume 14, Issue 3, July 1991, Pages 229-261, ISSN 0745-7138, https://doi.org/10.1016/0745-7138(91)90014-I. (https://www.sciencedirect.com/science/article/pii/074571389190014I) Abstract: This article presents a step-by-step introduction to an object-oriented implementation of list and graph data structures in Object Pascal, including examples of elementary as well as non-trivial algorithms on lists and graphs. It emphasizes granular, incremental design to promote ease of understanding. Issues of object encapsulation and software re-use are discussed as well. The article is intended as a guideline for teaching data structures and algorithms, with additional emphasis on software engineering, to mid-curriculum students. Subhas Chandra Misra, Vinod Kumar, Uma Kumar, Identifying some important success factors in adopting agile software development practices, Journal of Systems and Software, Volume 82, Issue 11, November 2009, Pages 1869-1890, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2009.05.052. 
(https://www.sciencedirect.com/science/article/pii/S016412120900123X) Abstract: Agile software development (ASD) is an emerging approach in software engineering, initially advocated by a group of 17 software professionals who practice a set of “lightweight” methods, and share a common set of values of software development. In this paper, we advance the state-of-the-art of the research in this area by conducting a survey-based ex-post-facto study for identifying factors from the perspective of the ASD practitioners that will influence the success of projects that adopt ASD practices. In this paper, we describe a hypothetical success factors framework we developed to address our research question, the hypotheses we conjectured, the research methodology, the data analysis techniques we used to validate the hypotheses, and the results we obtained from data analysis. The study was conducted using an unprecedentedly large-scale survey-based methodology, consisting of respondents who practice ASD and who had experience practicing plan-driven software development in the past. The study indicates that nine of the 14 hypothesized factors have a statistically significant relationship with “Success”. The important success factors that were found are: customer satisfaction, customer collaboration, customer commitment, decision time, corporate culture, control, personal characteristics, societal culture, and training and learning. Keywords: Success factors; Agile software Sandeep Kumar, Research-oriented teaching of PDC topics in integration with other undergraduate courses at multiple levels: A multi-year report, Journal of Parallel and Distributed Computing, Volume 105, July 2017, Pages 92-104, ISSN 0743-7315, https://doi.org/10.1016/j.jpdc.2017.01.004. 
(https://www.sciencedirect.com/science/article/pii/S0743731517300102) Abstract: Parallel and distributed computing (PDC) is finding its usage from system, algorithms, and architecture perspectives in research and industries of many domains. Due to its ever-increasing applications and benefits, the need of skilled manpower in the area of PDC is also increasing. It is felt that if the basic knowledge taught through the core course of PDC is supplemented with the discussion of PDC topics in integration with other computer science courses, then it will not only provide the students with more opportunities to ‘think in parallel’, but will also motivate them to harness the best of PDC. In this paper, we present our experiences of performing research-oriented teaching of PDC topics in integration with other undergraduate courses from 2014 to 2016 spread over multiple semesters. The courses mainly include Software Engineering, Computer Networks, Computer Architecture, Network Programming, and Network based Laboratory taught to undergraduate level students of Computer Science and Engineering. Our integration plan is inspired by the objectives of the NSF/TCPP–IEEE core curriculum initiative on PDC. Most of these courses are compulsory courses for undergraduate students of our department. In addition to the goals of the respective courses, we tried to fulfill many pedagogical goals corresponding to the PDC course. The teaching-contents of these courses have been adapted to also cover the aspects of PDC. A well-defined methodology for selection of PDC topics for integration, selection of topics for laboratory projects and home assignments, conduct of examinations (mid-term/end-term), pre-post feedback and evaluation, and other related activities has been planned. The methodology is intended to fulfill the previously set goals and to achieve the intended learning outcomes (ILOs). 
This paper presents the methodology used, detailed topics and integration plan of PDC topics along with corresponding Bloom levels, ILOs, evaluation strategies, and performance evaluation based on the students’ feedback and statistical analysis. Success of integration has been validated by performing statistical analysis of students’ pre-post feedback and performance in examinations. Keywords: PDC; Software engineering; Computer Networks; Computer architecture; Network programming; Network based laboratory; NSF early adopter award P.E. Preece, S.A. Al Bader, J.L. Davila Pernia, J.A.C. Evans, M.K. Giller, The development of a 2-D graphical process flowsheet generator (PFG) and a piping and instrumentation generator (PIG), Computers & Chemical Engineering, Volume 11, Issue 3, 1987, Pages 279-289, ISSN 0098-1354, https://doi.org/10.1016/0098-1354(87)85009-3. (https://www.sciencedirect.com/science/article/pii/0098135487850093) Abstract: Two software packages PFG and PIG are described. They enable users to draw 2-D process flowsheet and piping and instrumentation diagrams in monochrome and colour using graphical input/output. Icon- and menu-driven input techniques together with a pen-like stylus or mouse are exploited to generate a highly interactive environment which facilitates process design and analysis in much greater depth than was previously possible. Developed for undergraduate teaching, the packages are also relevant to practising engineers and to those involved in software engineering for the process industries. PFG and PIG are written in FORTRAN 77 using GINO-F for drawing. They interface with commercial process flowsheeting simulators producing the topological equipment stream connection information from the screen display. 
Francisco Ortin, Jose Manuel Redondo, Jose Quiroga, Design and evaluation of an alternative programming paradigms course, Telematics and Informatics, Volume 34, Issue 6, September 2017, Pages 813-823, ISSN 0736-5853, https://doi.org/10.1016/j.tele.2016.09.014. (https://www.sciencedirect.com/science/article/pii/S0736585316301393) Abstract: The knowledge of the most common programming paradigms, and the basic abstractions provided by each paradigm, are competencies to be attained by Software Engineering undergraduate students. These abstractions also include the basis of concurrent and parallel programming, present in different programming paradigms. In an existing Software Engineering degree, these competencies were assigned to the Programming Technology and Paradigms course. We present the approach followed in the design of that course to teach object-oriented, functional, concurrent and parallel programming to second year undergraduate students with basic knowledge of Java. The time limitations of the course prevented us from using various programming languages. After analyzing different alternatives, we chose C# to teach the course. We describe the most important challenges faced and how we addressed them. The course success rate is slightly greater than the rest of courses in the same year and degree, while performance rates and average marks are analogous. There is no influence of age and gender on the final mark, but students retaking the course have significantly worse evaluation than those enrolled for the first time. The students’ self-evaluation revealed that the proposed course has a strong influence on the achievement of the expected learning outcomes, and their satisfaction with the course was significantly higher than with the rest of courses in the same degree. 
Keywords: Programming paradigms; Object-orientation; Functional programming; Concurrency; Parallelism; Meta-programming; Dynamic typing; C# Gunther Paul, Warwick Pearse, An international benchmark for the Australian OHS Body of Knowledge (BoK), Safety Science, Volume 81, January 2016, Pages 13-24, ISSN 0925-7535, https://doi.org/10.1016/j.ssci.2015.07.016. (https://www.sciencedirect.com/science/article/pii/S0925753515001800) Abstract: Benchmarking was used to compare the Australian SIA’s (Safety Institute of Australia) OHS BoK with three different approaches to systemize the knowledge that should be taught by universities. The Australian Health and Safety Professionals Alliance (HaSPA) Core Body of Knowledge for Generalist OHS Professionals was benchmarked against three other international bodies of knowledge, the German Ergonomic Society’s Body of Knowledge Ergonomics – Core Definition, Object Catalogue and Research Domains, the IEEE Computer Society Software Engineering Body of Knowledge and the American ‘Association of Schools of Public Health’ Master’s Degree in Public Health Core Competency Model. It was found that quality, structure and content of the OHS BoK ranked lowest when compared with the other benchmarked documents. The HaSPA body of knowledge was ranked poorly when compared to the German Ergonomic Society’s Body of Knowledge for Ergonomics, IEEE Computer Society Software Engineering Body of Knowledge and the American Association of Schools of Public Health Core Competency Model. Analysis and discussion of the HaSPA BoK is important given its use as an audit tool for tertiary education in Australia. Furthermore, the International Network of Safety & Health Practitioner Organisations (INSHPO) is apparently promoting the Australian SIA’s OHS BoK as the basis of an international standard. 
Keywords: Body of knowledge; Ergonomics; Work health and safety; Safety Institute of Australia Ltd; Professional certification; University accreditation Iris Reinhartz-Berger, Arnon Sturm, Yair Wand, Comparing functionality of software systems: An ontological approach, Data & Knowledge Engineering, Volume 87, September 2013, Pages 320-338, ISSN 0169-023X, https://doi.org/10.1016/j.datak.2012.09.005. (https://www.sciencedirect.com/science/article/pii/S0169023X12001073) Abstract: Organizations can reduce the costs and enhance the quality of required software by adapting existing software systems. Software adaptation decisions often involve comparing alternatives on two criteria: (1) how well a system meets users' requirements and (2) the effort required for adapting the system. These criteria reflect two points of view — of users and of developers. Common to both views is the notion of functionality, which software developers have traditionally used for effort estimation utilizing concepts such as function points. However, users involved in selecting systems are not necessarily familiar with such concepts. We propose an approach for comparing software functionality from users' point of view. The approach employs ontological concepts to define functionality in terms of system behaviors. To evaluate whether or not the approach is also usable by software developers, we conducted an exploratory experiment. In the experiment, software engineering students ranked descriptions of software systems on the amount of changes needed to adapt the systems to given requirements. The results demonstrated that the ontological approach was usable after a short training and provided results comparable to ranking done by expert software developers. We also compared the ontological approach to a method which employed function point concepts. 
The results showed no statistically significant differences in performance, but there seemed to be an advantage to the ontological approach for cases that were difficult to analyze. Moreover, it took less time to apply the ontological approach than the function point-based approach, and the difference was statistically significant. Keywords: Software comparison; Variability management; Ontologies; Requirements engineering; Development effort estimation; Function point analysis Freeman L. Moore, Phillip R. Purvis, Training practicing software engineers at Texas Instruments, Journal of Systems and Software, Volume 10, Issue 4, November 1989, Pages 253-260, ISSN 0164-1212, https://doi.org/10.1016/0164-1212(89)90071-X. (https://www.sciencedirect.com/science/article/pii/016412128990071X) Abstract: Much has been written about software engineering programs from the viewpoint of the academician, but do these programs really reflect the needs of industry? This paper provides some insight into the needs of practicing software engineers at Texas Instruments who are developing software according to military specifications and requirements for embedded real-time systems. The needs of our environment are compared to the entering skills of a typical new hire, with the differences noted. These differences can be satisfied by internal training that covers all aspects of software engineering, from communicating with co-workers to understanding the system life cycle. Carmen Zannier, Mike Chiasson, Frank Maurer, A model of design decision making based on empirical results of interviews with software designers, Information and Software Technology, Volume 49, Issue 6, June 2007, Pages 637-653, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2007.02.010. (https://www.sciencedirect.com/science/article/pii/S0950584907000122) Abstract: Despite the impact of design decisions on software design, we have little understanding about how design decisions are made.
This hinders our ability to provide design metrics, processes and training that support inherent design work. By interviewing 25 software designers and using content analysis and explanation building as our analysis technique, we provide qualitative and quantitative results that highlight aspects of rational and naturalistic decision making in software design. Our qualitative multi-case study results in a model of design decision making to answer the question: how do software designers make design decisions? We find the structure of the design problem determines the aspects of rational and naturalistic decision making used. The more structured the design decision, the less a designer considers options. Keywords: Design decision; Rational decision making; Naturalistic decision making; Interviewing Rashina Hoda, Annette Henderson, Shiree Lee, Bridget Beh, Jason Greenwood, Aligning technological and pedagogical considerations: Harnessing touch-technology to enhance opportunities for collaborative gameplay and reciprocal teaching in NZ early education, International Journal of Child-Computer Interaction, Volume 2, Issue 1, January 2014, Pages 48-59, ISSN 2212-8689, https://doi.org/10.1016/j.ijcci.2014.06.001. (https://www.sciencedirect.com/science/article/pii/S221286891400018X) Abstract: Abstract New Zealand early childhood education (ECE) aims to provide a mix of teacher and child-led learning. A non-prescriptive curriculum allows for broad and rich early years teaching and learning experiences, with teachers responsive to devising engaging activities to align with children’s diverse interests. However, such spontaneity presents an on-going challenge for teachers. 
Using a combination of Action Research, elements of User-Centered and Participatory Design, and Scrum software development approaches, we conducted a multi-disciplinary study which leveraged joint contributions of software engineers and experts, including practitioners (teachers), users (children and teachers), and domain experts (in ECE curriculum and pedagogy, and early childhood psychology). Examination of teacher–child interactions with our software demonstrated that our game was engaging, promoted collaborative gameplay (by promoting mutual awareness, opportunities for information, and equitable control) and supported reciprocal teaching (by aligning children’s interests with content knowledge). Finally, it opens new avenues for introducing research and pedagogy-informed interactive educational software in the NZ ECE domain. Brian Hanks, Empirical evaluation of distributed pair programming, International Journal of Human-Computer Studies, Volume 66, Issue 7, July 2008, Pages 530-544, ISSN 1071-5819, https://doi.org/10.1016/j.ijhcs.2007.10.003. (https://www.sciencedirect.com/science/article/pii/S1071581907001395) Abstract: Pair programming, in which two individuals share a single computer to collaboratively develop software, has been shown to have many benefits in industry and in education. One drawback of pair programming is its collocation requirement, which limits its use to situations where the partners can physically meet. A tool that supported distributed pair programming, in which the partners could pair from separate locations, would remove this impediment. This paper discusses the development and empirical evaluation of such a tool. A significant feature of this tool is the presence of a second cursor that supports gesturing. Students who used the tool in their introductory programming course performed as well as collocated students on their programming assignments and final exam. These students also spent less time working by themselves. 
They also felt that the gesturing feature was useful and used it regularly. Keywords: Distributed pair programming; Gesturing; Introductory programming; Empirical software engineering; Computer-supported cooperative work K. Vinay Kumar, V. Ravi, Mahil Carr, N. Raj Kiran, Software development cost estimation using wavelet neural networks, Journal of Systems and Software, Volume 81, Issue 11, November 2008, Pages 1853-1867, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2007.12.793. (https://www.sciencedirect.com/science/article/pii/S0164121208000071) Abstract: Software development has become an essential investment for many organizations. Software engineering practitioners have become more and more concerned about accurately predicting the cost and quality of software products under development. Accurate estimates are desired, but no model has proved to be successful at effectively and consistently predicting software development cost. In this paper, we propose the use of wavelet neural networks (WNN) to forecast software development effort. We used two types of WNN, with the Morlet function and the Gaussian function as transfer functions, and also proposed a threshold acceptance training algorithm for wavelet neural networks (TAWNN). The effectiveness of the WNN variants is compared with other techniques such as multilayer perceptron (MLP), radial basis function network (RBFN), multiple linear regression (MLR), dynamic evolving neuro-fuzzy inference system (DENFIS) and support vector machine (SVM) in terms of the mean magnitude of relative error (MMRE) obtained on the Canadian financial (CF) dataset and the IBM data processing services (IBMDPS) dataset. Based on the experiments conducted, it is observed that WNN-Morlet on the CF dataset and WNN-Gaussian on the IBMDPS dataset outperformed all the other techniques. Also, TAWNN outperformed all the other techniques except WNN.
Keywords: Software development effort; Software cost estimation; Wavelet neural networks; Threshold accepting based wavelet neural network Fritz H Grupe, Robert Urwiler, Narender K Ramarapu, Mehdi Owrang, The application of case-based reasoning to the software development process, Information and Software Technology, Volume 40, Issue 9, 15 September 1998, Pages 493-499, ISSN 0950-5849, https://doi.org/10.1016/S0950-5849(98)00072-X. (https://www.sciencedirect.com/science/article/pii/S095058499800072X) Abstract: This paper, supported by a commercial case-based reasoning tool, demonstrates a method by which case-based reasoning can be applied to the business software development process. Requirements definition, effort estimation, software design, troubleshooting, and maintenance processes are discussed in terms of their candidacy for CBR technology. Proper planning for an adequate support infrastructure is stressed, as well as clear expectation setting through ongoing training. CBR is explored as a mechanism for addressing the productivity and quality problems currently afflicting the corporate software development field. Keywords: Case-based reasoning; CBR; Software development; Function point analysis; Systems analysis; Systems design; Software reusability Jesús Sánchez Cuadrado, Javier Luis Cánovas Izquierdo, Jesús García Molina, Applying model-driven engineering in small software enterprises, Science of Computer Programming, Volume 89, Part B, 1 September 2014, Pages 176-198, ISSN 0167-6423, https://doi.org/10.1016/j.scico.2013.04.007. (https://www.sciencedirect.com/science/article/pii/S0167642313001056) Abstract: Abstract Model-Driven Engineering (MDE) is increasingly gaining acceptance in the software engineering community; however, its adoption by the industry is far from successful. The number of companies applying MDE is still very limited.
Although several case studies and reports have been published on MDE adoption in large companies, experience reports on small enterprises are still rare, despite the fact that they represent a large part of the software companies ecosystem. In this paper we report on our practical experience in two transfer-of-technology projects with two small companies. In order to determine the degree of success of these projects, we present some factors that have to be taken into account in transfer-of-technology projects. Then, we assess both projects by analyzing these factors and applying some metrics to give hints about the potential productivity gains that MDE could bring. We also comment on some lessons learned. These experiences suggest that MDE has the potential to make small companies more competitive, because it enables them to build powerful automation tools at modest cost. We will also present the approach followed to train these companies in MDE, and we contribute the teaching material so that it can be used or adapted by other projects of this nature. Keywords: Model Driven Engineering; Experience report; Small companies; Incremental consistency Froduald Kabanza, Guy Bisson, Annabelle Charneau, Taek-Sueng Jang, Implementing tutoring strategies into a patient simulator for clinical reasoning learning, Artificial Intelligence in Medicine, Volume 38, Issue 1, September 2006, Pages 79-96, ISSN 0933-3657, https://doi.org/10.1016/j.artmed.2006.01.003. (https://www.sciencedirect.com/science/article/pii/S0933365706000194) Abstract: Summary Objective This paper describes an approach for developing intelligent tutoring systems (ITS) for teaching clinical reasoning.
Materials and methods Our approach to ITS for clinical reasoning uses a novel hybrid knowledge representation for the pedagogic model, combining finite state machines to model different phases in the diagnostic process, production rules to model triggering conditions for feedback in different phases, temporal logic to express triggering conditions based upon past states of the student's problem solving trace, and finite state machines to model feedback dialogues between the student and TeachMed. The expert model is represented by an influence diagram capturing the relationship between evidence and hypotheses related to a clinical case. Results This approach is implemented into TeachMed, a patient simulator we are developing to support clinical reasoning learning for a problem-based learning medical curriculum at our institution; we demonstrate some scenarios of tutoring feedback generated using this approach. Conclusion Each of the knowledge representation formalisms that we use has already been proven successful in different applications of artificial intelligence and software engineering, but their integration into a coherent pedagogic model as we propose is unique. The examples we discuss illustrate the effectiveness of this approach, making it promising for the development of complex ITS, not only for clinical reasoning learning, but potentially for other domains as well. Keywords: Intelligent tutoring systems; Clinical reasoning learning; Patient simulation Derk J. Kiewiet, René J. Jorna, Wout v. Wezel, Planners and their cognitive maps: An analysis of domain representations using multi dimensional scaling, Applied Ergonomics, Volume 36, Issue 6, November 2005, Pages 695-708, ISSN 0003-6870, https://doi.org/10.1016/j.apergo.2005.05.009. (https://www.sciencedirect.com/science/article/pii/S0003687005000967) Abstract: We expected planners working on the planning problems at a particular department for many years to have similar ideas about their work domain. 
However, we found the opposite when we analyzed the results of an extensive knowledge elicitation study we performed on 25 planners working in the rolling stock and rolling staff planning department of the Netherlands Railways (in Dutch: NS). The planners were asked to model their domain by means of card sorting and graph-positioning methods using domain-related objects and relations. When we applied multi dimensional scaling to the outcome of this study, we found large differences in the individual cognitive maps thus created. In this article, we describe the analyses and discuss our findings about planners’ cognitive maps. These maps may have implications for the organization of planning in general, for the training of planners, and for software design. Robin Abraham, Martin Erwig, UCheck: A spreadsheet type checker for end users, Journal of Visual Languages & Computing, Volume 18, Issue 1, February 2007, Pages 71-95, ISSN 1045-926X, https://doi.org/10.1016/j.jvlc.2006.06.001. (https://www.sciencedirect.com/science/article/pii/S1045926X06000383) Abstract: Spreadsheets are widely used, and studies have shown that most end-user spreadsheets contain non-trivial errors. Most of the currently available tools that try to mitigate this problem require varying levels of user intervention. This paper presents a system, called UCheck, that detects errors in spreadsheets automatically. UCheck carries out automatic header and unit inference, and reports unit errors to the users. UCheck is based on two static analysis phases that infer header and unit information for all cells in a spreadsheet. We have tested UCheck on a wide variety of spreadsheets and found that it works accurately and reliably. The system was also used in a continuing education course for high school teachers, conducted through Oregon State University, aimed at making the participants aware of the need for quality control in the creation of spreadsheets.
Keywords: Spreadsheet; Unit; Type; Automatic error detection; Debugging; End-user software engineering D.N. Wilson, Case: Guidelines for success, Information and Software Technology, Volume 31, Issue 7, September 1989, Pages 346-350, ISSN 0950-5849, https://doi.org/10.1016/0950-5849(89)90155-9. (https://www.sciencedirect.com/science/article/pii/0950584989901559) Abstract: Information-systems managers have embraced one or more of a number of software engineering techniques over the last 15 years in their search for a development approach that guarantees a quality product with a minimum of effort from developers; the latest of these approaches is computer-aided software engineering (CASE) tools. The paper discusses CASE tools, looks at the information-systems track record with software engineering principles, and suggests four rules for successful implementation of CASE: implement sound methods and techniques first, develop a clear statement of requirements before acquiring CASE tools, ensure that there is management commitment, and provide adequate initial and ongoing training. Keywords: system development; computer-aided software engineering; CASE; CASE tools M. Gómez, J. Cervantes, User Interface Transition Diagrams for customer–developer communication improvement in software development projects, Journal of Systems and Software, Volume 86, Issue 9, September 2013, Pages 2394-2410, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2013.04.022. (https://www.sciencedirect.com/science/article/pii/S0164121213001003) Abstract: Abstract We formalize the definition and construction of the User Interface Transition Diagram (UITD) which is a modelling notation for the transitions between UI presentations and the necessary conditions to trigger these transitions. 
We show how the UITD is able to improve the communication between stakeholders in a software development project: Human–Computer Interaction specialists, Software Engineers and customers who have little or no training in specialized modelling notations. We compare the UITD with other existing similar modelling notations highlighting the features that are better expressed in the UITD. We also include a case study in order to show how the UITD can be helpful in different phases of a software development project. The understandability of the UITD was confirmed by means of a test where different types of potential users were involved. Keywords: User Interface Flow; Functional requirements specification; Modelling notation Nelson Morgan, James Beck, Phil Kohn, Jeff Bilmes, Eric Allman, Joachim Beer, The ring array processor: A multiprocessing peripheral for connectionist applications, Journal of Parallel and Distributed Computing, Volume 14, Issue 3, March 1992, Pages 248-259, ISSN 0743-7315, https://doi.org/10.1016/0743-7315(92)90067-W. (https://www.sciencedirect.com/science/article/pii/074373159290067W) Abstract: We have designed and implemented a Ring Array Processor (RAP) for fast implementation of our continuous speech recognition training algorithms, which are currently dominated by layered “neural” network calculations. The RAP is a multi-DSP system with a low-latency ring interconnection scheme using programmable gate array technology and a significant amount of local memory per node (4–16 Mbytes of dynamic memory and 256 Kbytes of fast static RAM). Theoretical peak performance is 128 MFLOPS/board. A working system with 20 nodes has been used for our research at rates of 200–300 million connections per second for probability evaluation, and at roughly 30–60 million connection updates per second for training. A fully functional system with 40 nodes has also been benchmarked at roughly twice these rates. 
While practical considerations such as workstation address space restrict current implementations to 64 nodes, the architecture scales to about 16,000 nodes. For problems with 2 units per processor, communication and control overhead would reduce peak performance on the error back-propagation algorithm to about 50% of a linear speedup. This report describes the motivation for the RAP and shows how the architecture matches the target algorithm. We further describe some of the key features of the hardware and software design. Michael J. Findler, Ritesh Kumar Chalawadi, Teaching STAMP: High Level Communication Design Concerns for a Domestic Robot, Procedia Engineering, Volume 179, 2017, Pages 52-60, ISSN 1877-7058, https://doi.org/10.1016/j.proeng.2017.03.095. (https://www.sciencedirect.com/science/article/pii/S1877705817312110) Abstract: Abstract This past semester, I taught Software Safety at the University of Houston Clear Lake using STAMP as the framework for understanding safety. The students were expected to address the safety design concerns of a research domestic robot our department is developing. This is a general software engineering degree, and as such, the students’ knowledge of the domestic robot domain was virtually non-existent. In a nutshell, what follows are the lessons learned.
• After a general description and simple demo of a prototype robot, the students were told of an “accident” involving the domestic robot. Acting as accident investigators, the students interviewed the teacher and teaching assistant as eyewitnesses to the accident.
• After learning the details of the accident, the students performed a CAST analysis of it. Although rudimentary, the students delivered a mini-accident investigation presentation and report.
• Following the accident investigation, the students changed hats and became safety designers, using STPA as the tool to describe a high-level design of the robot communication system.
Presented here are their conclusions and commentary on the academic drill. Keywords: STAMP; domestic robot; teaching Zeeshan Muzaffar, Moataz A. Ahmed, Software development effort prediction: A study on the factors impacting the accuracy of fuzzy logic systems, Information and Software Technology, Volume 52, Issue 1, January 2010, Pages 92-109, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2009.08.001. (https://www.sciencedirect.com/science/article/pii/S0950584909001244) Abstract: Reliable effort prediction remains an ongoing challenge to software engineers. Traditional approaches to effort prediction, such as the use of models derived from historical data or the use of expert opinion, are plagued with issues pertaining to their effectiveness and robustness. These issues are more pronounced when the effort prediction is used during the early phases of the software development lifecycle. Recent works have demonstrated promising results obtained with the use of fuzzy logic. Fuzzy logic based effort prediction systems can deal better with imprecision, which characterizes the early phases of most software development projects; for example, the effort predictors of requirements development, along with their relationships to effort, are even more imprecise and uncertain than those of later development phases, such as design. Fuzzy logic based prediction systems could produce even better estimates provided that the various parameters and factors pertaining to fuzzy logic are carefully set. In this paper, we present an empirical study which shows that the prediction accuracy of a fuzzy logic based effort prediction system is highly dependent on the system architecture, the corresponding parameters, and the training algorithms.
Keywords: Effort prediction; Fuzzy logic; COCOMO; Imprecision; Uncertainty GC Low, DR Jeffery, Software development productivity and back-end CASE tools, Information and Software Technology, Volume 33, Issue 9, November 1991, Pages 616-621, ISSN 0950-5849, https://doi.org/10.1016/0950-5849(91)90033-8. (https://www.sciencedirect.com/science/article/pii/0950584991900338) Abstract: Computer-aided software engineering (CASE) technology offers the potential for substantial productivity improvements. With management information systems departments under increasing pressure to improve productivity, CASE technology has been adopted by organizations. However, its adoption is not without risks. The paper examines the productivity results from the use of two different back-end CASE tools for software development in three large Australian organizations. The investigation concludes that overall there is no statistical evidence for a productivity improvement or decline resulting from the use of either of the two back-end CASE tools studied. Close evaluation of individual projects reveals support for traditional learning-curve patterns and the importance of staff training in new technology. Keywords: computer-aided software engineering; CASE; CASE tools; productivity Leo Højsholt-Poulsen, Production of educational software: The situation in Denmark, Education and Computing, Volume 5, Issues 1–2, 1989, Pages 11-15, ISSN 0167-9287, https://doi.org/10.1016/S0167-9287(89)80004-4. (https://www.sciencedirect.com/science/article/pii/S0167928789800044) Abstract: This paper describes how, as a general trend, the development of software for secondary education in Europe has divided into two paths: the development of general application tools, and the development of smaller pieces of subject-oriented lessonware. In Denmark, it is recommended that software be designed in the philosophy of the ‘market place’ model, along the lines of such software as ‘Guide’. 
The majority of the software, though, is still designed with a linear structure, more or less advanced. An example of educational software is given, in the form of lessonware for physics education. Orfeus is a new Danish organization, which has been established by the local educational authorities in charge of schools. The aim of Orfeus is to provide software for Danish schools in greater quantity and of better quality, at the same time encouraging teachers and students to use available software. Keywords: Circuit; Guide model; Hypertext model; Lessonware; Linear model; Market place model; Orfeus; Physics; Software design model; Software evaluation; Software production Shirley Cruz, Fabio Q.B. da Silva, Luiz Fernando Capretz, Forty years of research on personality in software engineering: A mapping study, Computers in Human Behavior, Volume 46, May 2015, Pages 94-113, ISSN 0747-5632, https://doi.org/10.1016/j.chb.2014.12.008. (https://www.sciencedirect.com/science/article/pii/S0747563214007237) Abstract: Abstract In this article, we present a systematic mapping study of research on personality in software engineering. The goal is to plot the landscape of current published empirical and theoretical studies that deal with the role of personality in software engineering. We applied the systematic review method to search and select published articles, and to extract and synthesize data from the selected articles that reported studies about personality. Our search retrieved more than 19,000 articles, from which we selected 90 articles published between 1970 and 2010. Nearly 72% of the studies were published after 2002 and 83% of the studies reported empirical research findings. Data extracted from the 90 studies showed that education and pair programming were the most recurring research topics, and that MBTI was the most used test.
Research related to pair programming, education, team effectiveness, software process allocation, software engineer personality characteristics, and individual performance concentrated over 88% of the studies, while team process, behavior and preferences, and leadership performance were the topics with the smallest number of studies. We conclude that the number of articles has grown in the last few years, but contradictory evidence was found that might have been caused by differences in context, research method, and versions of the tests used in the studies. While this raises a warning for practitioners that wish to use personality tests in practice, it shows several opportunities for the research community to improve and extend findings in this field. Keywords: Human factors in software engineering; Software psychology; Empirical software engineering; Mapping study; Systematic literature review Paul Ralph, Software engineering process theory: A multi-method comparison of Sensemaking–Coevolution–Implementation Theory and Function–Behavior–Structure Theory, Information and Software Technology, Volume 70, February 2016, Pages 232-250, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2015.06.010. (https://www.sciencedirect.com/science/article/pii/S0950584915001238) Abstract: AbstractContext Software engineering has experienced increased calls for attention to theory, including process theory and general theory. However, few process theories or potential general theories have been proposed and little empirical evaluation has been attempted. Objective The purpose of this paper is to empirically evaluate two previously untested software development process theories – Sensemaking–Coevolution–Implementation Theory (SCI) and the Function–Behavior–Structure Framework (FBS). Method A survey of more than 1300 software developers is combined with four longitudinal, positivist case studies to achieve a simultaneously broad and deep empirical evaluation. 
Instrument development, statistical analysis of questionnaire data, case data analysis using a closed-ended, a priori coding scheme, and data triangulation are described. Results Case data analysis strongly supports SCI, as does analysis of questionnaire response distributions (p < 0.001; chi-square goodness of fit test). Furthermore, case-questionnaire triangulation found no evidence that support for SCI varied by participants’ gender, education, experience, nationality or the size or nature of their projects. Conclusions SCI is supported. No evidence of an FBS subculture was found. This suggests that instead of iterating between weakly-coupled phases (analysis, design, coding, testing), it is more accurate and useful to conceptualize development as ad hoc oscillation between making sense of the project context (Sensemaking), simultaneously improving mental representations of the context and design space (Coevolution) and constructing, debugging and deploying software artifacts (Implementation). Keywords: Process theory; Software process; Case study; Questionnaire Yvonne Dittrich, What does it mean to use a method? Towards a practice theory for software engineering, Information and Software Technology, Volume 70, February 2016, Pages 220-231, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2015.07.001. (https://www.sciencedirect.com/science/article/pii/S095058491500124X) Abstract: Context Methods and processes, along with the tools to support them, are at the heart of software engineering as a discipline. However, as we all know, using the same method often affects neither software projects nor their resulting software in a comparable manner. What is lacking is an understanding of how methods affect software development.
Objective The article develops a set of concepts based on the practice-concept in the philosophy of sociology as a basis for describing software development as a social practice and for developing an understanding of methods and their application that explains the heterogeneity in the outcome. Practice here is not understood as opposed to theory, but as a commonly agreed upon way of acting that is acknowledged by the team. Method The article applies concepts from the philosophy of sociology and social theory to describe software development, and develops the concepts of method and method usage. The results and steps in the philosophical argumentation are exemplified using published empirical research. Results The article develops a conceptual base for understanding software development as social and epistemic practices, and defines methods as practice patterns that need to be related to, and integrated in, an existing development practice. The application of a method is conceptualized as a development of practice. This practice is in certain aspects aligned with the description of the method, but a method always under-defines practice. The implications for research, industrial software development and teaching are indicated. Conclusion The theoretical/philosophical concepts make it possible to explain the heterogeneity in the application of software engineering methods in line with empirical research results. Keywords: Cooperative and human aspects of software engineering; Software engineering methods; Practice theory Erwin Schoitsch, Software processes, assessment and ISO 9000-certification: A user's view, Journal of Systems Architecture, Volume 42, Issue 8, 31 December 1996, Pages 653-661, ISSN 1383-7621, https://doi.org/10.1016/S1383-7621(96)00048-3.
(https://www.sciencedirect.com/science/article/pii/S1383762196000483) Abstract: The Austrian Research Centre Seibersdorf and its IT-Department have been involved in the development of critical computer systems and in standardization in this field for many years (SAFECOMP '89, '90, '91, '93, IEC SC 65A WG9 and WG10, IEC TC 56, partners in the European initiative ESPITI and the networks ENCRESS and OLOS). The certification process for ISO 9001 started with a pre-audit in December 1993, and the certificate was successfully achieved at the end of June 1994. ISO 9000-3 (somewhat more process-related than ISO 9001) and the ESA Software Engineering Standards (lifecycle model, process models) were the key input to the Quality Management (QM) System of the IT-Department. Additionally, the Department of Information Technology successfully applied for a BOOTSTRAP license early in 1994. Four members of the staff of the IT department are currently qualified as external BOOTSTRAP assessors. In preparation for ISO 9000-certification and during BOOTSTRAP-training we learnt much about organizations, process improvement and project management, especially by reviewing our own processes critically as well as reviewing the impact and relevance of the schemes to follow when ISO 9000 certification or BOOTSTRAP licensing is the goal to achieve. Direct as well as indirect business benefits were achieved. Keywords: Quality management; Software engineering standards; Dependability management standards; Quality management standards; Software process assessment; Software process improvement; Dependability; Project management; Critical systems; Safety related systems; Software dependability; ISO 9000-certification David M. Raffo, Marc I. Kellner, Empirical analysis in software process simulation modeling, Journal of Systems and Software, Volume 53, Issue 1, 15 July 2000, Pages 31-41, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(00)00006-6.
(https://www.sciencedirect.com/science/article/pii/S0164121200000066) Abstract: Software process simulation modeling is increasingly being used to address a variety of issues from the strategic management of software development, to supporting process improvements, to software project management training. The scope of software process simulation applications ranges from narrowly focused portions of the life cycle to longer-term product evolutionary models with broad organizational impacts. This paper discusses some of the important empirical issues that arise in software process simulation modeling. We first address issues concerning real-world data used to (1) establish input parameters to a software process simulation model, and (2) establish actual organizational results against which the model’s results (i.e., outputs) will be compared. On the input side, the challenges include small sample sizes, considerable variability and outliers, lack of desired data, loosely defined metrics, and so forth. On the output side, the paper addresses (1) verification and validation of the model, and (2) quantitative approaches to evaluating model outputs in support of managerial decision making, including financial performance using Net Present Value (NPV), multi-criteria utility functions, and Data Envelopment Analysis (DEA). The paper focuses on stochastic modeling using Monte Carlo simulation. The paper is grounded in the authors’ practical application experiences, and major points are illuminated by examples drawn from that field work. Keywords: Process modeling; Return on investments; Empirical software engineering; Data Envelopment Analysis João Nascimento, Paulo Resende da Silva, João Samartinho, Construction of a Web-based Project Management Simulator: Proposal, Process and Features, Procedia Technology, Volume 9, 2013, Pages 730-739, ISSN 2212-0173, https://doi.org/10.1016/j.protcy.2013.12.081.
(https://www.sciencedirect.com/science/article/pii/S2212017313002351) Abstract: This paper proposes a model of a simulation system for Project Management based on the Web environment (Project Management Virtual Environment - PMVE), its features, and the process of its construction. This system aims to support the process of teaching/learning the subject of project management. When compared with other work in this field, two aspects stand out: the use of the Web to enhance the interaction between the agents involved in the teaching/learning process, and the variety of skills considered in the system. More than focusing on a small set of processes dealing with technical work management, the solution presented here involves nearly all of the project management life cycle. The description of this solution starts by presenting the context of the problem being solved, highlighting some of the features of the project management activity. Next, the conceptual perspective behind the construction of the system is presented, followed by the technological solution and respective features. Keywords: Software Engineering; Project Management; Computer Based Training B.R. Gaines, Hardware engineering and software engineering, Euromicro Newsletter, Volume 3, Issue 2, April 1977, Pages 16-21, ISSN 0303-1268, https://doi.org/10.1016/0303-1268(77)90062-1. (https://www.sciencedirect.com/science/article/pii/0303126877900621) Abstract: This paper is concerned with the teaching of software engineering techniques to electronic engineers. It is noted that the majority of those designing systems around microprocessors have probably been trained as engineers, and it is suggested that training in software engineering techniques should be based on conventional engineering experience in both design and manufacturing. Problems with software arise in large part because its design and production are treated quite differently from those of hardware.
Concepts such as modularity, process-structuring and virtual machines are not primarily software ones but reflect concepts and disciplines that are readily assimilated in conventional electronic system design. Even programming, as an activity, can be seen to have its counterpart in other aspects of electronic system design that do not involve computers. Thus the approach to teaching advocated is one that concentrates on the integration of hardware and software design and production techniques and attempts to break down the separation that seems to have grown between them. Robert Urwiler, Narender K. Ramarapu, Ronald B. Wilkes, Mark N. Frolick, Computer-aided software engineering: The determinants of an effective implementation strategy, Information & Management, Volume 29, Issue 4, October 1995, Pages 215-225, ISSN 0378-7206, https://doi.org/10.1016/0378-7206(95)00025-R. (https://www.sciencedirect.com/science/article/pii/037872069500025R) Abstract: This report investigates the determinants of a successful Computer-Aided Software Engineering (CASE) tool implementation. Success was defined as a perceived increase in both the quality of software produced and the productivity of the software developers as a result of the introduction of the technology. To investigate the effects of certain environmental conditions on the relative success, a survey was mailed to two hundred members of a specific CASE tool user group. Approximately thirty-five percent responded. The findings indicate that an environment which includes the enforcement of a development methodology and the use of metrics contributes to perceived improvements in quality when using CASE. Also, the use of metrics, use of consultants, and formal training contribute to perceived improvements in developer productivity. Apparently, the presence of each of these environmental conditions significantly contributes to a successful CASE implementation and is, therefore, a 'determinant' of a successful implementation strategy.
Keywords: CASE implementation; Metrics; Productivity; Quality; Software engineering; Systems development Tsutomu Matsumoto, Shigeyasu Kawaji, Model and Software Design for Health Care and Life Support, IFAC Proceedings Volumes, Volume 36, Issue 10, June 2003, Pages 197-202, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)33679-0. (https://www.sciencedirect.com/science/article/pii/S1474667017336790) Abstract: The importance of developing intelligent control systems has been pointed out for ill-structured domains that include humans, such as health care support, life support, education and so on. In order to develop software systems targeting these fields, analyzing tasks and information based on cybernetics is very useful for modeling and software design. Unfortunately, design methodology education is not given to students, because it is very difficult to derive mathematical equations for the behaviour of such objects. In this paper, the modeling and software design of a health care and life support system, as a typical example of an ill-structured domain, are described. The main part of this paper has been used as educational material for final-project students who belong to our research group. They learn the design method for systems that include humans via the tutorial and the educational material. Keywords: Ill structure; Modeling; Patient Model; Disease Model; Clinical Diagnosis Decision Support; Health Care Modeling; Design Method Mohammad Hannan, Analysis of the collaborative activities in software development processes from the perspective of chronotopes, Computers in Human Behavior, Volume 27, Issue 1, January 2011, Pages 248-267, ISSN 0747-5632, https://doi.org/10.1016/j.chb.2010.08.003. (https://www.sciencedirect.com/science/article/pii/S0747563210002463) Abstract: The theory of ‘Chronotope’ was introduced by Mikhail Bakhtin in his study of literary genres and subsequently investigated in the fields of media, education, arts, music, film and other disciplines.
Classroom chronotopes analyzing student–teacher collaborative activities in the real world have been investigated by researchers for over a decade, but a similar study is absent in the software world, especially in CSCW (Computer Supported Cooperative Work). The focus of this article is to show how collaborative activities in a real-world software development process might fit into certain types of chronotope, thus applying and extending Bakhtin’s theory of chronotope to the area of software development processes and methodologies, and providing further motivation for research on, and applicability of, chronotopes in the area of CSCW. Keywords: Chronotope; Dialogism; Software development process; Software design methodology Les Hatton, Repetitive failure, feedback and the lost art of diagnosis, Journal of Systems and Software, Volume 47, Issues 2–3, 1 July 1999, Pages 183-188, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(99)00038-2. (https://www.sciencedirect.com/science/article/pii/S0164121299000382) Abstract: This paper highlights a growing problem as software systems become more and more tightly coupled and complex: that of poor diagnostic capability. Diagnosis has never been particularly sophisticated in software systems, often being an ad hoc process in which programmers receive no training. As a result, significant numbers of failures in modern systems cannot be diagnosed in the sense of being uniquely related to one or more faults, and as such they continue to fail. Consequently, modern software systems are unique amongst modern engineering systems in being characterised by repetitive and frequently avoidable failure. This paper discusses the strongly related issues of repetitive failure, feedback and diagnosis in a generally light-hearted way but makes a plea for diagnosis to be an essential feature of systems design.
All engineering systems fail and this knowledge should be part of the software design process such that inevitable failures can be quickly related to the contributing fault or faults and the system corrected to avoid future re-occurrence. José M. Iñesta, Eloy Izquierdo, M. Ángeles Sarti, Software tools for using a personal computer as a timer device to assess human kinematic performance: a case study, Computer Methods and Programs in Biomedicine, Volume 47, Issue 3, August 1995, Pages 257-265, ISSN 0169-2607, https://doi.org/10.1016/0169-2607(95)01686-N. (https://www.sciencedirect.com/science/article/pii/016926079501686N) Abstract: Frequently, the assessment of the physical condition of a sportsman depends on the evaluation of different tests, based on biomechanical performance. Data acquisition in these tests is usually done by hand, because it is difficult to automate. But when movements are constrained by their specific nature, simple tools can be used for that data acquisition. In this paper, a simple and inexpensive system is described that uses the timing capabilities of a personal computer (PC) as a timer, with applications in biomechanics and sport training. The data acquisition method is based on a PC that, using specific programming for event timing, gets signals through the printer port from a receptor device that detects cuts in an infrared cell beam. Low-level procedures are provided that can be used in problem-dependent, higher-level algorithmic designs to build specific systems. The case of the evaluation of the Wingate Anaerobic Test is discussed. Keywords: Software design; Algorithm; Biomechanics-instrumentation; Data acquisition; Sports medicine; Wingate anaerobic test Alejandro Alonso, Ramón F. Alfonso, José F.
Ruíz, Marisol García-Valls, On the Use of a Railroad Model for Real-Time Systems Teaching and Experimenting, IFAC Proceedings Volumes, Volume 33, Issue 7, May 2000, Pages 197-202, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)39954-8. (https://www.sciencedirect.com/science/article/pii/S1474667017399548) Abstract: The availability of a physical system, suitable for being controlled by computers and where time requirements can be naturally defined, provides a very useful basis for teaching and for practically checking research results or new technologies in the field of real-time systems. With this goal in mind, a railroad model has been developed and used in a Real-Time Laboratory Course. In the context of this laboratory, an application to control the train has been implemented. This work was based on the spiral model, UML for supporting the architectural and detailed design, and Ada. The course was taken in conjunction with a Software Engineering Laboratory Course, where aspects such as team management, version control, and architectural design were dealt with in more depth. The results were very positive. The students finished their assignments on time. The use of the mentioned basic techniques helped to reduce the development time. Keywords: Real-Time Systems; Distributed Computer Control Systems; Software Engineering; Ada Marc I Kellner, Raymond J Madachy, David M Raffo, Software process simulation modeling: Why? What? How?, Journal of Systems and Software, Volume 46, Issues 2–3, 15 April 1999, Pages 91-105, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(99)00003-5. (https://www.sciencedirect.com/science/article/pii/S0164121299000035) Abstract: Software process simulation modeling is increasingly being used to address a variety of issues from the strategic management of software development, to supporting process improvements, to software project management training.
The scope of software process simulation applications ranges from narrowly focused portions of the life cycle to longer-term product evolutionary models with broad organizational impacts. This article provides an overview of work being conducted in this field. It identifies the questions and issues that simulation can be used to address ('why'), the scope and variables that can be usefully simulated ('what'), and the modeling approaches and techniques that can be most productively employed ('how'). It includes a summary of the papers in this special issue of the Journal of Systems and Software, which were presented at the First International Silver Falls Workshop on Software Process Simulation Modeling (ProSim'98). It also provides a framework that helps characterize work in this field, and applies this new characterization scheme to many of the articles in this special issue. This paper concludes by offering some guidance in selecting a simulation modeling approach for practical application, and recommending some issues warranting additional research. H. Joel Jeffrey, Pragmatic design of meetings and presentations, Journal of Systems and Software, Volume 19, Issue 1, September 1992, Pages 85-96, ISSN 0164-1212, https://doi.org/10.1016/0164-1212(92)90022-C. (https://www.sciencedirect.com/science/article/pii/016412129290022C) Abstract: Designing and running good meetings and presentations is a mystery to many software engineers, including many very experienced ones. Many workshops, training courses, and books address the mechanics (overheads, speaking style, and so on), but these provide comparatively little information beyond the most basic reminders, such as “no personal insults” or “make an agenda and stick to it”.
These basics do not address such difficult problems as presenting to an audience made up of people from several departments at several levels of management, handling skeptical or oppositional people, what to say to the manager whose authority is being undermined, talking to top management in language they can understand about things they care about (rather than what the programmer thinks they ought to care about), and so on. This article presents an organized and comprehensive approach to designing meetings and presentations. It discusses techniques for addressing the differing attitudes, degrees of knowledge, and concerns of the audience, and presents a step-by-step procedure for designing a presentation that considers the audience's knowledge, attitudes, and perspectives. Charles Ume, Marc Timmerman, Mechatronics instruction in the Mechanical Engineering curriculum at Georgia Tech, Mechatronics, Volume 5, Issue 7, October 1995, Pages 723-741, ISSN 0957-4158, https://doi.org/10.1016/0957-4158(95)00043-5. (https://www.sciencedirect.com/science/article/pii/0957415895000435) Abstract: The increasing use of microcontrollers and microprocessors in a wide variety of consumer and commercial products, laboratory test instruments and equipment, and industrial applications has created a need for Mechatronics education in all engineering disciplines. The subject of Mechatronics is broad, encompassing, and interdisciplinary. How to teach Mechatronics in the various Engineering disciplines is still an open area for discussion. This presentation focuses on the way Mechatronics is taught in the George Woodruff School of Mechanical Engineering at Georgia Tech. The courses are structured to teach students design at the micro-chip level, and to teach them how to write assembly language programs for measurement and control.
They learn by hands-on experience with interfacing sensors, actuators, and passive and active devices with microprocessors and microcomputers in laboratory exercises. The class includes a final group project and a group class lecture. Each group of three students is required to complete a design project that integrates hardware and software design with electronic interfacing design and mechanical systems analysis. Several such projects and all the laboratory exercises are presented. Matthiesen, Schmidt, Moeser, Munker, The Karlsruhe SysKIT Approach – A Three-step SysML Teaching Approach for Mechatronic Students, Procedia CIRP, Volume 21, 2014, Pages 385-390, ISSN 2212-8271, https://doi.org/10.1016/j.procir.2014.03.136. (https://www.sciencedirect.com/science/article/pii/S2212827114006763) Abstract: Mechatronic engineers are essential links within modern product development processes. They closely interact with the classical domains of mechanical, electrical and software engineering in order to support the design of customer-oriented products of high integral functionality. Because of this, their day-to-day business in practice is strongly characterized by working with different departments and stakeholders from different disciplines. To improve their communication and to build up a common understanding, an interdisciplinary modeling language is needed. The Systems Modeling Language (SysML) is a language for modeling these interdisciplinary technical aspects of a system. This paper presents an educational concept for SysML focusing on the students’ abilities. The concept is taught to undergraduate mechatronic students in the fifth semester. In a multidisciplinary course (mechanical, electrical and software engineering) the students have lectures, exercises and a development project. In their development project they have to use SysML for modeling – concepts, prototypes, validation and optimization.
The SysML workshop developed here therefore takes place at the very beginning of the semester. The approach is designed as a two-day workshop, split into three sessions – introduction, abstract modeling and detailing. First, basic elements and modeling techniques are introduced and the students start with a guided step-by-step example. The complexity of the modeling tasks is raised during the second and third sessions. During the workshop the undergraduate students start with individual pen-and-paper modeling, followed by small modeling groups with whiteboards. To simulate the industrial environment, they finally interact in teams and build up a complex model, learning how to change and modify their model under changing boundary conditions and to generate different views targeted at stakeholder domains – simulated by a role-play. After this two-day workshop the students should have learned how to model in a non-sequential way – as in reality – and are well prepared for their development project. The use of software tools for SysML modeling is not part of this workshop. The goal of this approach is to teach the understanding and application of modeling with SysML. Working together as a team on the same model is practiced as well. The concept is currently being tested in a pilot study with 10 students. The modeling understanding and knowledge of these students is nearly the same as that of the mechatronic students. The aim of the study is to show the comprehensibility of the SysML teaching approach, determine the time needed for the whole workshop, and identify gaps in the teaching material. This paper introduces the KIT SysML teaching approach for mechatronic engineering and reflects on the concept with reference to publications on comparable development methods. It further points out the differences between teaching SysML to modeling beginners and its productive usage. Keywords: Mechatronics; SysML; Educational Approach; Systems Engineering; Modeling Method Miguel A.
Teruel, Elena Navarro, Víctor López-Jaquero, Francisco Montero, Pascual González, A CSCW Requirements Engineering CASE Tool: Development and usability evaluation, Information and Software Technology, Volume 56, Issue 8, August 2014, Pages 922-949, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2014.02.009. (https://www.sciencedirect.com/science/article/pii/S0950584914000512) Abstract: Context CSRML Tool 2012 is a Requirements Engineering CASE Tool for the Goal-Oriented Collaborative Systems Requirements Modeling Language (CSRML). Objective This work aims at evaluating the usability of CSRML Tool 2012, thus identifying any possible usability flaws to be solved in the next releases of the application, as well as giving a general overview of how to develop a DSL tool similar to the one evaluated in this work by means of the Visual Studio Modelling SDK. Method In this evaluation, which was reported following the ISO/IEC 25062:2006 Common Industry Format for usability tests, 28 fourth-year Computer Science students took part. They were asked to carry out a series of modifications to an incomplete CSRML requirements specification. Usability was assessed by measuring task completion rate, elapsed time, the number of accesses to the tool's help system, and the instructor's verbal assistance. The participants' arousal and pleasantness were assessed by analyzing both facial expressions and a USE questionnaire. Results In spite of obtaining high usability levels, the test outcome revealed some usability flaws that should be addressed. Conclusion The important lessons learnt from this evaluation are also applicable to the success of other usability tests as well as to the development of new CASE tools. Keywords: Usability evaluation; CASE tool; CSRML; Requirements engineering; CSCW; ISO/IEC 25062:2006 Martin Kuehnhausen, Victor S.
Frost, Application of the Java Message Service in mobile monitoring environments, Journal of Network and Computer Applications, Volume 34, Issue 5, September 2011, Pages 1707-1716, ISSN 1084-8045, https://doi.org/10.1016/j.jnca.2011.06.003. (https://www.sciencedirect.com/science/article/pii/S1084804511001159) Abstract: Distributed systems and sensor networks in particular are in need of efficient asynchronous communication, message security and integrity, and scalability. These points are especially important in mobile environments where mobile remote sensors are connected to a control center only via intermittent communication. We present a general approach for dealing with the issues that arise in such scenarios. This approach is applied to provide flexible and efficient cargo monitoring on trains. The Java Message Service (JMS) presents a flexible transport layer for asynchronous communication that enables transparent store-and-forward queuing for entities that need to be connected to each other. Previously JMS was primarily used in always-connected high-bandwidth enterprise communication systems. We present the advantages of using JMS in a mobile, bandwidth-limited, and intermittently connected monitoring environment and provide a working implementation called the Transportation Security SensorNet (TSSN) that makes use of an implementation of JMS called ActiveMQ. This solution is employed here to enable monitoring of cargo in motion along trusted corridors. Results obtained from experiments and a field trial show that using JMS provides not just a practical alternative to often custom binary communication layers, but a better and more flexible approach, by providing transparency. Applications on both communication ends only need to implement JMS connectors while the remaining functionality is provided by the JMS implementation. Another benefit arises from the exchangeability of JMS implementations. 
In utilizing JMS we demonstrate a new, flexible and scalable approach to cope with challenges inherent in intermittent and low-bandwidth communication in mobile monitoring environments. Keywords: Telemetry; Transport protocols; Intermittently connected wireless networks; Communication system software; Data communication; Software engineering Wolfgang Halang, Carlos Eduardo Pereira, Recommendations for a Real Time Systems Curriculum, IFAC Proceedings Volumes, Volume 28, Issue 18, September 1995, Pages 69-74, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)45128-7. (https://www.sciencedirect.com/science/article/pii/S1474667017451287) Abstract: An outline of a syllabus for the education of real-time systems engineers is given. This comprises the treatment of basic concepts, real-time software engineering and programming in high-level real-time languages, real-time operating systems with special emphasis on such topics as task scheduling, hardware architectures and especially distributed automation structures, process interfacing, system reliability and fault tolerance, and finally integrated project development support systems. Accompanying course material and laboratory work are outlined, and suggestions for establishing a laboratory with advanced, but low-cost, hardware and software are provided. Keywords: real-time systems; real-time languages; real-time operating systems W. Thury, F. Walter, How to Develop Reliable Microprocessor Software Systems for Process Control, IFAC Proceedings Volumes, Volume 18, Issue 11, September 1985, Pages 539-544, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)60180-0. (https://www.sciencedirect.com/science/article/pii/S1474667017601800) Abstract: The triumphal march of innovative microelectronics continues.
The rapidly increasing industrial use of microprocessors not only poses technical problems for the development of machines, products and installations; the organisation of development, support and maintenance, as well as the training of qualified workers, also calls for commitment and considerable investment. The central task is the development of especially reliable software. The lecture touches on the situation regarding the development of microprocessor-controlled systems and demonstrates how to keep development risk and costs under control by using specialized development systems: the UNIX-based, universally usable tools system CAMIC/S will be presented, and its successful use in industry will be demonstrated using facts and figures. Keywords: Universal development system; microprocessor projects; software tools; cross software; test aids; UNIX (UNIX is a registered trade mark of Bell Laboratories); facts and figures from experience George Papazafeiropoulos, Miguel Muñiz-Calvente, Emilio Martínez-Pañeda, Abaqus2Matlab: A suitable tool for finite element post-processing, Advances in Engineering Software, Volume 105, March 2017, Pages 9-16, ISSN 0965-9978, https://doi.org/10.1016/j.advengsoft.2017.01.006. (https://www.sciencedirect.com/science/article/pii/S0965997816306512) Abstract: A suitable piece of software is presented to connect Abaqus, a sophisticated finite element package, with Matlab, the most comprehensive program for mathematical analysis. This interface between these well-known codes not only benefits from the image processing and the integrated graph-plotting features of Matlab but also opens up new opportunities in results post-processing, statistical analysis and mathematical optimization, among many other possibilities. The software architecture and usage are appropriately described and two problems of particular engineering significance are addressed to demonstrate its capabilities.
Firstly, the software is employed to assess cleavage fracture through a novel 3-parameter Weibull probabilistic framework. Then, its potential to create and train neural networks is used to identify damage parameters through a hybrid experimental–numerical scheme, and to model crack propagation in structural materials by means of a cohesive zone approach. The source code, detailed documentation and a large number of tutorials can be freely downloaded from www.abaqus2matlab.com. Keywords: Abaqus2Matlab; Post-processing; Finite Element Method; Weibull stress model; Inverse analysis N.B. Šerbedžija, V.M. Todorović, Real-time Simulation of Water Distribution System, IFAC Proceedings Volumes, Volume 18, Issue 14, October 1985, Pages 247-252, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)60065-X. (https://www.sciencedirect.com/science/article/pii/S147466701760065X) Abstract: In this paper the real-time simulation of a water distribution system is presented. The purpose of the system is to help in understanding water distribution system performance, to prepare plans for normal and emergency conditions, and to support dispatcher training. Special attention is paid to software design, which allows real-time model behavior, multi-user interaction, ad hoc processing of input commands, and various CRT displays. The software architecture is parallel: the system is divided into three independent processes (input, output, calculation). Processes execute concurrently and special primitives are defined for their communication and synchronization. The man-machine interface is designed to meet the requirement of user-friendly interaction. The system provides two regimes of work and several speeds of simulation. It records simulation history, allows repetition of a previous simulation starting from an arbitrarily chosen time point, and provides efficient tools for monitoring and analyzing the behavior of the large-scale water distribution system.
Keywords: Computer software; man-machine systems; parallel processing; concurrent programming; process simulation; water resources Shaowen Wang, Yan Liu, Nancy Wilkins-Diehr, Stuart Martin, SimpleGrid toolkit: Enabling geosciences gateways to cyberinfrastructure, Computers & Geosciences, Volume 35, Issue 12, December 2009, Pages 2283-2294, ISSN 0098-3004, https://doi.org/10.1016/j.cageo.2009.05.002. (https://www.sciencedirect.com/science/article/pii/S0098300409002210) Abstract: Cyberinfrastructure science and engineering gateways have become an important modality to connect science and engineering communities and cyberinfrastructure. The use of cyberinfrastructure through gateways is fundamental to the advancement of science and engineering. However, learning science gateway technologies and developing science gateways remain a significant challenge, given that science gateway technologies are still actively evolving and often include a number of sophisticated components. A geosciences gateway must be designed to accommodate legacy methods that geoscientists use in conventional computational tools. The research described in this paper establishes an open-source toolkit, SimpleGrid, for learning and developing science gateways based on a service-oriented architecture, using a component-based approach that allows flexible separation and integration of the components between geocomputation applications and cyberinfrastructure. The design and implementation of SimpleGrid is based on the National Science Foundation TeraGrid, a key element of the U.S. and world cyberinfrastructure. This paper illustrates our experience of using SimpleGrid and a spatial interpolation method in a tutorial to teach TeraGrid science gateways. Keywords: Component-based software engineering; Cyberinfrastructure; Grid computing; Science and engineering gateways; Service-oriented architecture; Spatial interpolation Oytun Eriş, Veysel G. Anik, Uğur Yildirim, Mehmet T.
Söylemez, Salman Kurtulan, Comparison of the Parallel and Serial Architectures for N-Version Programming as Applied to Railway Interlocking Systems, IFAC Proceedings Volumes, Volume 46, Issue 25, 2013, Pages 60-64, ISSN 1474-6670, https://doi.org/10.3182/20130916-2-TR-4042.00032. (https://www.sciencedirect.com/science/article/pii/S1474667015352083) Abstract: The concept of functional safety gains importance with the increasing number of hazardous accidents in the railway industry. In the literature, some hardware and software architectures have been proposed for functional safety. While N-version programming is gaining popularity as a preferred software architecture in the railway industry, the effect of various hardware implementations of N-version programming on functional safety remains unclear. In this study, two different hardware setups will be evaluated for N-version programming. After the effect of these hardware setups on functional safety is analyzed, the effects on hardware usage and overall response time will be tested on a sample train station. Keywords: N-version programming; Interlocking System; PLC; Railway signalization; Functional Safety Applications for transportation K. Moidu, O. Wigertz, E. Trell, Multi centre systems analysis study of primary health care: A study of socio-organizational and human factors, International Journal of Bio-Medical Computing, Volume 30, Issue 1, January 1992, Pages 27-42, ISSN 0020-7101, https://doi.org/10.1016/0020-7101(92)90059-2. (https://www.sciencedirect.com/science/article/pii/0020710192900592) Abstract: The information management systems to support health programmes are inadequate. As computers become cheaper and more powerful, their application in the strengthening of the information infrastructure becomes more feasible. However, the high cost of specialized applications software limits their potential, especially in developing countries.
A multi-centre systems analysis (a descriptive study using a questionnaire) was made of District Health Sites in developing countries to analyse whether a common specialized application software design for implementation at a primary health care centre was feasible. Responses to the questionnaires by physicians at the primary health centres were compared between district health sites using contingency tables. Significant inter-site differences in social factors existed; respondents had no prior experience, but with near unanimity (98%) they accepted the idea of computer assistance in their work. However, general reservations (31%) and fears (26%) about computer interference in the doctor-patient relationship were expressed. The human factor must be considered in interface design and training before implementation. Keywords: Computers; Developing country; End-users; Information systems; Primary health care; Physicians; Systems analysis study Sigmund Akselsen, Gunnar Hartvigsen, Kjell-Roald Langseth, Experiences from the use of the Grimstad-model for design and implementation of educational software, Education and Computing, Volume 7, Issues 3–4, 1991, Pages 253-265, ISSN 0167-9287, https://doi.org/10.1016/S0167-9287(09)90016-4. (https://www.sciencedirect.com/science/article/pii/S0167928709900164) Abstract: The objective of educational software is to encourage the user to increase his knowledge of a specific domain. The pedagogical goals are achieved through a high degree of user control, and they place heavy demands on the design process. A common platform for instructional designers and computer scientists to construct educational software is needed. This article outlines the Grimstad-model for design and implementation of educational software. We present some snapshots from a project in which elements of the model were used. The simulation program developed in the project shows important issues to consider when keeping reindeer.
According to experiences gained from the practical use of the Grimstad-model and from giving teachers' further education courses, we propose extensions to the model. Keywords: the Grimstad-model; computer-aided instruction; software design; software development; educational computing J. Barrie Thompson, Helen M. Edwards, Providing new graduate opportunities: experiences with a UK master’s level computing conversion course, Journal of Systems and Software, Volume 49, Issues 2–3, 30 December 1999, Pages 135-143, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(99)00086-2. (https://www.sciencedirect.com/science/article/pii/S0164121299000862) Abstract: The forces that operate within the software industry are outlined and the case is made for postgraduate software engineering education that is flexible in terms of attendance patterns and responsive to the market in terms of curricula. The developments within a postgraduate “conversion” course at the University of Sunderland over a 9 year period with regard to both patterns of delivery/attendance and curricula are then described (including the origins of the course and the development of five different delivery modes). The profile of graduates at entry to the course is illustrated, and thumbnail sketches of several graduates from the course are drawn to reflect the diversity of their subsequent careers. The content of the original and revised versions of the course is outlined, and a comparison is made of the approaches adopted towards the treatment of Software Engineering (SE) in the two versions. Finally, a critical appraisal is presented with regard to the changes introduced in the course. M.H. Selamat, C.Y. Choong, A.T. Othman, M.M. Rahim, Non-use phenomenon of CASE tools: Malaysian experience, Information and Software Technology, Volume 36, Issue 9, September 1994, Pages 531-537, ISSN 0950-5849, https://doi.org/10.1016/0950-5849(94)90098-1.
(https://www.sciencedirect.com/science/article/pii/0950584994900981) Abstract: Although computer-aided software engineering (CASE) technology has received widespread acceptance in the Western information systems community, many organizations in Malaysia are encountering difficulties in introducing CASE to automate their systems development tasks. It has been reported that CASE tools have failed to achieve success in a considerable number of Malaysian organizations. Therefore a study has been undertaken to examine the reasons that led to non-use of CASE tools. It is found that in Malaysia, the problems of non-use of CASE are human-oriented, relating to CASE users, software managers and CASE vendors. Surprisingly, technical issues were found to have less impact in the abandonment of CASE products. The findings of this study are also compared with those of studies conducted in the UK and USA. Some interesting observations are highlighted. Finally, it is advised that organizations should not move too quickly from the manual systems development approach to a CASE environment. A strong commitment, adequate training and appreciation programs will, hopefully, minimize many of these problems. Keywords: computer-aided software engineering; CASE failure; non-use phenomenon Heinrich C. Mayr, Christian Urich, A CIM Based Approach to Software Project Management, IFAC Proceedings Volumes, Volume 29, Issue 2, September 1996, Pages 65-70, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)43779-7. (https://www.sciencedirect.com/science/article/pii/S1474667017437797) Abstract: Despite 27 years of research and education in software engineering, the construction of software systems tends to exceed time and cost estimates considerably. Moreover, software systems still suffer from laxness in requirements analysis, software process management and quality assurance. Many reasons for that problem have been stated as arguments for introducing a new concept, method, model or tool.
This paper takes another approach: Starting from the assumption that the reasons are more fundamental, i.e., that the engineering paradigm might not be adequate for all scopes of software development, we propose to adopt well known and proven methods from another discipline, namely industrial production. It will be shown that software development may be seen from a manufacturing point of view and thus may be managed using PPC (production planning and control) concepts. This will lead us to what we will call computer integrated software manufacturing (CISM). Keywords: production control and planning; software engineering; software production; software project management; capability maturity model; software process; work breakdown structure; work schedule; computer integrated software manufacturing (CISM) Barbara Kitchenham, David Budgen, Pearl Brereton, Philip Woodall, An investigation of software engineering curricula, Journal of Systems and Software, Volume 74, Issue 3, 1 February 2005, Pages 325-335, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2004.03.016. (https://www.sciencedirect.com/science/article/pii/S0164121204000548) Abstract: We adapted a survey instrument developed by Timothy Lethbridge to assess the extent to which the education delivered by four UK universities matches the requirements of the software industry. We propose a survey methodology that we believe addresses the research question more appropriately than the one used by Lethbridge. In particular, we suggest that restricting the scope of the survey to address the question of whether the curricula for a specific university addressed the needs of its own students allowed us to identify an appropriate target population. However, our own survey suffered from several problems. In particular, the questions used in the survey are not ideal, and the response rate was poor.
Although the poor response rate reduces the value of our results, our survey appears to confirm several of Lethbridge's observations with respect to the over-emphasis of mathematical topics and the under-emphasis of business topics. We also have close agreement with respect to the relative importance of different software engineering topics. However, the set of topics that we found were taught far less than their importance would suggest was quite different from the topics identified by Lethbridge. Keywords: Software engineering curricula; Survey methods Michael G. Murphy, Teaching software project management: a response–interaction approach, Journal of Systems and Software, Volume 49, Issues 2–3, 30 December 1999, Pages 145-148, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(99)00087-4. (https://www.sciencedirect.com/science/article/pii/S0164121299000874) Abstract: Southern Polytechnic State University has recently implemented a new Master of Science in Software Engineering degree, which includes a course in Software Project Management in its core requirements. This paper addresses an innovative approach to teaching this course through what is described as response–interaction. Also included are the results of the first offering of this course. Keywords: Software project management; Response–interaction; Software development Yani Widyani, Experience in Software Development Project Course, Procedia Technology, Volume 11, 2013, Pages 1018-1026, ISSN 2212-0173, https://doi.org/10.1016/j.protcy.2013.12.289. (https://www.sciencedirect.com/science/article/pii/S221201731300443X) Abstract: This paper describes an experience in delivering a software development project course in the Informatics Engineering Undergraduate Program at ITB. The objective is to propose a learning process model for a Software Engineering course. This model can be an alternative learning process that can improve students' knowledge and skills in software development practices.
According to the study program curriculum, the ability to develop small to medium-scale software is one of several learning outcomes that must be achieved by our graduates. To achieve this learning outcome, we give practical experience in applying one method to develop medium-scale software through a software development project course. This course is conducted in the form of a ‘real’ software development project. The students are allocated into several groups to give them an opportunity to work in teams. One medium software development project is assigned to each group. Since each project is part of a larger project, completion of these medium-scale software development projects will produce a large-scale software system. Using the iterative and incremental approach known as the Unified Process, each group conducted a full software development life cycle: defining a software requirement specification, requirement analysis, design modeling, coding, and testing. Evaluation of this course over several semesters showed that project-based courses can improve students' understanding of software engineering. The Unified Process is considered to be an appropriate method for this software development project. Despite the several obstacles that were still encountered, this course model can still be improved to achieve the ultimate goal of this course. Keywords: course; software development; project; unified process Éric Germain, Pierre N. Robillard, Engineering-based processes and agile methodologies for software development: a comparative case study, Journal of Systems and Software, Volume 75, Issues 1–2, 15 February 2005, Pages 17-27, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2004.02.022. (https://www.sciencedirect.com/science/article/pii/S016412120400038X) Abstract: The emergence of various software development methodologies raises the need to evaluate and compare their efficiencies.
One way of performing such a comparison is to have different teams apply different process models in the implementation of multiple versions of common specifications. This study defines a new cognitive activity classification scheme which has been used to record the effort expended by six student teams producing parallel implementations of the same software requirements specifications. Three of the teams used a process based on the Unified Process for Education (UPEDU), a teaching-oriented process derived from the Rational Unified Process. The other three teams used a process built around the principles of the Extreme Programming (XP) methodology. Important variations in effort at the cognitive activity level between teams show that the classification could scarcely be used without categorization at a higher level. However, the relative importance of a category of activities aimed at defining “active” behaviour was shown to be almost constant for all teams involved, possibly showing a fundamental behaviour pattern. As secondary observations, aggregate variations by process model tend to be small and limited to a few activities, and coding-related activities dominate the effort distribution for all the teams. Keywords: Empirical software engineering; Process measurement; Cognitive activity; Productivity W.K. Chan, M.Y. Cheng, S.C. Cheung, T.H. Tse, Automatic goal-oriented classification of failure behaviors for testing XML-based multimedia software applications: An experimental case study, Journal of Systems and Software, Volume 79, Issue 5, May 2006, Pages 602-612, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2005.05.031. (https://www.sciencedirect.com/science/article/pii/S0164121205001226) Abstract: When testing multimedia software applications, we need to overcome important issues such as the forbidding size of the input domains, great difficulties in repeating non-deterministic test outcomes, and the test oracle problem.
A statistical testing methodology is proposed. It applies pattern classification techniques enhanced with the notion of test dimensions. Test dimensions are orthogonal properties of associated test cases. Temporal properties are studied in the experiments in this paper. For each test dimension, a pattern classifier is trained on the normal and abnormal behaviors. A type of failure is said to be classified if it is recognized by the classifier. Test cases can then be analyzed by the failure pattern recognizers. Experiments show that some test dimensions are more effective than others in failure identification and classification. Keywords: Software testing; Test dimensions; Multimedia application testing; Failure identification; Failure classification Y. Zhang, P. Hitchcock, EMS: case study in methodology for designing knowledge-based systems and information systems, Information and Software Technology, Volume 33, Issue 7, September 1991, Pages 518-526, ISSN 0950-5849, https://doi.org/10.1016/0950-5849(91)90096-T. (https://www.sciencedirect.com/science/article/pii/095058499190096T) Abstract: The paper provides a computer-aided software engineering (CASE) tool environment for linking structured systems analysis, formal system specification in Z, and system implementation in CPD. The proposed approach is applicable to the design and execution of conceptual schemas appropriate to a relational database environment and to deductive database applications. The methodology is illustrated with a detailed application — the Education Management System (EMS). Keywords: systems analysis; methodologies; formal methods; knowledge-based systems; Prolog Charles W. Rosenthal, Rik Vigeland, An update on a maturity benchmarking process for electronic design processes, Computers in Industry, Volume 30, Issue 1, 1 September 1996, Pages 5-11, ISSN 0166-3615, https://doi.org/10.1016/0166-3615(96)00021-8.
(https://www.sciencedirect.com/science/article/pii/0166361596000218) Abstract: Industrial organizations are seeking means for improving their competitive positions by better management of their activities. For electronics organizations, one of the activities receiving attention is the product design process. Improving the quality, productivity and cost of an electronic design process should start with an assessment of the current process. A new, quantitative method is in use for assessing process performance. The method was devised starting from a model of an excellent design process. The model was constructed from the combined judgments of experienced managers and engineers and from criteria included in development improvement programs. A questionnaire has been designed to inquire about the use of the excellent model practices. The questionnaire is completed by a cross-section of the team using the current design process. The questionnaire's responses characterize the process and highlight its strengths and weaknesses. The characterization leads directly to a prioritized list of suggested improvements. The responses are also used to benchmark process maturity employing a scale derived from the five-step scale developed for software design processes by the Software Engineering Institute. The benchmark measurement is used to monitor subsequent improvement in the process as changes are made and to compare a process with industry profiles. The benchmarking method has been used on sixty-one assessments since its introduction in 1990. Using these assessments as an indicator of industry practices we conclude that there are weaknesses in customer requirements gathering, customer involvement in development, in training and support of managers and engineers, and in the use of aids for electronic design and component data transfer. The mean performance level for the processes is just above 2 on a scale of 1 to 5. 
Keywords: Maturity benchmarking process; Electronic design processes; Assessing process performance Barbara Kitchenham, Pearl Brereton, A systematic review of systematic review process research in software engineering, Information and Software Technology, Volume 55, Issue 12, December 2013, Pages 2049-2075, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2013.07.010. (https://www.sciencedirect.com/science/article/pii/S0950584913001560) Abstract: Context Many researchers adopting systematic reviews (SRs) have also published papers discussing problems with the SR methodology and suggestions for improving it. Since guidelines for SRs in software engineering (SE) were last updated in 2007, we believe it is time to investigate whether the guidelines need to be amended in the light of recent research. Objective To identify, evaluate and synthesize research published by software engineering researchers concerning their experiences of performing SRs and their proposals for improving the SR process. Method We undertook a systematic review of papers reporting experiences of undertaking SRs and/or discussing techniques that could be used to improve the SR process. Studies were classified with respect to the stage in the SR process they addressed, whether they related to education or problems faced by novices and whether they proposed the use of textual analysis tools. Results We identified 68 papers reporting 63 unique studies published in SE conferences and journals between 2005 and mid-2012. The most common criticisms of SRs were that they take a long time, that SE digital libraries are not appropriate for broad literature searches and that assessing the quality of empirical studies of different types is difficult.
Conclusion We recommend removing advice to use structured questions to construct search strings and including advice to use a quasi-gold standard based on a limited manual search to assist the construction of search strings and evaluation of the search process. Textual analysis tools are likely to be useful for inclusion/exclusion decisions and search string construction but require more stringent evaluation. SE researchers would benefit from tools to manage the SR process but existing tools need independent validation. Quality assessment of studies using a variety of empirical methods remains a major problem. Keywords: Systematic review; Systematic literature review; Systematic review methodology; Mapping study Oliver Laitenberger, Jean-Marc DeBaud, Perspective-based reading of code documents at Robert Bosch GmbH, Information and Software Technology, Volume 39, Issue 11, 1997, Pages 781-791, ISSN 0950-5849, https://doi.org/10.1016/S0950-5849(97)00030-X. (https://www.sciencedirect.com/science/article/pii/S095058499700030X) Abstract: Despite dramatic changes in software development in the two decades since the term software engineering was coined, software quality deficiencies and cost overruns continue to afflict the software industry. Inspections, developed at IBM by Fagan in the early 1970s [1], can be used to improve upon these problems because they allow the detection and removal of defects after each phase of the software development process. But, in most published inspection processes, individuals performing defect detection are not systematically supported. There, defect detection depends heavily upon factors like chance or experience. Further, there is an ongoing debate in the literature whether or not defect detection is more effective when performed as a group activity and hence should be conducted in meetings [5,11,13,14].
In this article we introduce Perspective-based Reading (PBR) for code documents, a systematic technique to support individual defect detection. PBR offers guidance to individual inspectors for defect detection. This guidance is embodied within perspective-based algorithmic scenarios which make individual defect detection independent of experience. To test this assumption, we tailored and introduced PBR in the inspection process at Robert Bosch GmbH. We conducted two training sessions in the form of a 2 × 3 fractional-factorial experiment in which 11 professional software developers reviewed code documents from three different perspectives. The experimental results are: (1) Perspective-based Reading and the type of document have an influence on individual defect detection, (2) multi-individual inspection meetings were not very useful to detect defects, (3) the overlap of detected defects among inspectors using different perspectives is low, and (4) there are no significant differences with respect to defect detection between inspectors having experience in the programming language and/or the application domain and those that do not. Keywords: Quality assessment; Defect detection; Experimentation; Inspection; Perspective-based reading Wee Wee Sim, Peggy Brouse, Developing Ontologies and Persona to Support and Enhance Requirements Engineering Activities – A Case Study, Procedia Computer Science, Volume 44, 2015, Pages 275-284, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2015.03.060. (https://www.sciencedirect.com/science/article/pii/S1877050915002963) Abstract: This paper provides an insight into incorporating the persona concept and developing ontologies to support requirements engineering activities via a university course registration web application system case study.
The objectives are to examine (1) how the concept of persona, in the context of the concepts of viewpoint, goal, scenario, task, and requirement, may be integrated in a unified environment to enable stakeholders and developers to gain a better understanding of target users’ needs and behaviors and identify missing requirements early in the requirements engineering process, and (2) how the concepts and their relationships may be explicitly specified ontologically to help establish a knowledge repository and foster a shared common understanding of target users’ needs and behaviors among developers and stakeholders during the requirements analysis and modeling activity. A five-step iterative ontology development process is developed to help guide developers in the process of building the ontologies for the case study. We present the persona and viewpoint documents created and the ontology specifications specified in Protégé-Frames via applying our ontology development process. Keywords: Ontology; Persona; User Profile; User Modeling; Requirements Engineering; Systems Engineering; Knowledge Engineering. Paolo Bottoni, Mark Minas, Preface: Volume 72, Issue 3, Electronic Notes in Theoretical Computer Science, Volume 72, Issue 3, February 2003, Pages 176-177, ISSN 1571-0661, https://doi.org/10.1016/S1571-0661(05)80620-6. (https://www.sciencedirect.com/science/article/pii/S1571066105806206) Abstract: This volume contains the Proceedings of the Workshop on Graph Transformation and Visual Modelling Techniques (GT-VMT 2002). The Workshop was held in Barcelona, Spain, on October 11 and 12, 2002, as a satellite event of the First International Conference on Graph Transformation (ICGT 2002). Background Diagrammatic notations have accompanied the development of technical and scientific disciplines in fields as diverse as mechanical engineering, quantum physics, category theory, and software engineering.
In general, diagrammatic notations allow the construction of images associated with an interpretation based on considering as significant some well-defined spatial relations among graphical tokens. These tokens either derive from conventional notations employed in a user community or are elements specially designed to convey some meaning. The notations serve the purpose of defining the (types of) entities one is interested in and the types of relations among these entities. Hence, types must be distinguishable from one another and no ambiguity may arise as to their interpretation. Moreover, the set of spatial relations to be considered must be clearly defined, and the holding of any relation among any set of elements must be decidable. The evolution of diagrammatic notations usually follows a pattern that, from their usage as illustrations of sentences written in some formal or natural language, leads to the definition of "modelling languages". These languages are endowed with rules for the construction of "visual sentences" from some elementary graphical components, and for interpreting the meaning of these sentences with respect to the modeled domain, up to rules for mapping the behaviour of the modeled systems onto the behaviour of the visual elements in the model. Workshop Objectives As diagrammatic notations, such as UML, become widespread in software engineering and visual end user environments, there is an increasing need for formal methods to precisely define the syntax and semantics of such diagrams. In particular, when visual models of systems or processes constitute executable specifications of systems, not only is a non-ambiguous specification of their static syntax and semantics needed, but also an adequate notion of diagram dynamics. Such a notion must establish links (e.g., morphisms) which relate diagram transformations and transformations of the objects of the underlying domain.
The field of Graph Grammars and Graph Transformation Systems has contributed much insight into the solution of these problems, but other approaches (e.g., meta modelling, constraint-based and other rule-based systems) have also been developed to tackle specific issues. The workshop has followed in the line of successful workshops on Graph Transformations and Visual Modelling Techniques, which were previously held as satellite events of ICALP'00 and ICALP'01. It has gathered researchers working with different methodologies to discuss the relative merits and weaknesses of the different approaches to problems such as diagram parsing, diagram transformation, integrated management of syntactic and semantic aspects, and tool support for working with visual models. The focus has been on methodological aspects rather than on particular technical aspects. Program Committee The papers in this volume were reviewed by the program committee. Workshop program The workshop was scheduled for one and a half days and included a session with an invited talk by Martin Gogolla as well as 12 presentations of papers in four regular sessions on Geometry and Visualization, on Frameworks and Tools, on Euler/Venn Diagrams, and on Components, Models, and Semantics. Joint Session The workshop featured a special session on Case Studies for Visual Modelling Techniques held jointly with the Workshop on Software Evolution Through Transformations (SET 2002).
This session was part of the work carried out under the European research training network SegraVis (for Syntactic and Semantic Integration of Visual Modelling Techniques), with the objective to employ, evaluate, and improve visual modelling techniques in specific domains, including (but not limited to): modelling support for software evolution and refactoring; modelling of component-based software architectures; and specification of applications with mobile soft- and hardware. Besides a general discussion of these objectives, the session consisted of presentations of three submitted case studies and position statements by the SegraVis objective coordinators. Acknowledgement This workshop was supported by the European research training network SegraVis. Anne Lee Paxton, Edward J. Turner, The application of human factors to the needs of the novice computer user, International Journal of Man-Machine Studies, Volume 20, Issue 2, February 1984, Pages 137-156, ISSN 0020-7373, https://doi.org/10.1016/S0020-7373(84)80014-0. (https://www.sciencedirect.com/science/article/pii/S0020737384800140) Abstract: In this article the literature on the application of human factors to the needs of the novice or inexperienced computer user was reviewed. The need for research in the area was illustrated by the fact that an increasing number of people who are not computer professionals are using computers routinely in their jobs. A need for the development of man-computer systems that are maximally suited to the users' needs and preferences was indicated. The needs of the manager as a naive or novice computer user were described as a case in point. Methods to assist members of the university community obtain maximum benefit from computer facilities were also reviewed.
The importance of applying human factors to software design as well as the overall design of the man-computer interface was discussed in the literature, along with recommendations for specific design and other techniques that would aid the novice in effective use of the computer. Research on the application of human factors to text editing for the novice was reviewed, and the results indicated that novices work best with an inflexible, natural-language-based text editor. An examination of the literature provided support for designing help facilities for the novice, such as a help key. Anxiety, attitude, and closure were also discussed in the literature as affecting the learning and performance of the novice computer user. The application of human factors to the training of the novice computer user was another area covered in the review. Literature on programming in the future home was discussed, which included recommendations for making computers more useful in that environment. Various implications for future research were presented, including methods to treat computer anxiety as well as design techniques to assist the novice. Bernard C. Jiang, Development of a machine vision system for education, Computers & Industrial Engineering, Volume 18, Issue 1, 1990, Pages 23-28, ISSN 0360-8352, https://doi.org/10.1016/0360-8352(90)90038-N. (https://www.sciencedirect.com/science/article/pii/036083529090038N) Abstract: This paper discusses the development of a software-flexible, hardware-economical, modular machine vision system for education. The purpose of this development was to avoid major capital investment for the purchase of machine vision equipment and to eliminate extensive software development when machine vision technology advances. A description of the hardware components and software architecture is given, followed by three examples of how this system can be used. The limitations of the system are also discussed.
Gianluca Palli, Raffaella Carloni, Claudio Melchiorri, Innovative Tools for Real-Time Simulation of Dynamic Systems, IFAC Proceedings Volumes, Volume 41, Issue 2, 2008, Pages 14612-14617, ISSN 1474-6670, https://doi.org/10.3182/20080706-5-KR-1001.02475. (https://www.sciencedirect.com/science/article/pii/S1474667016413406) Abstract: In this paper, we present a software architecture, based on RTAI-Linux, for the real-time simulation of dynamic systems and for the rapid prototyping of digital controllers. Our aim is to simplify the testing phase of digital controllers by providing real-time simulation of the plant with the same interface used for communication between the control applications and the real plant. This unified interface, based on the COMEDI library, allows the controller to be switched from the simulated to the real plant without any modification of the control software. Moreover, a set of tools has been developed to help users build the real-time simulation tasks of the plants. Particular attention has been paid to the automatic generation of symbolic kinematic and dynamic models of robotic manipulators from a description of the robot in terms of kinematic parameters and the inertia/center of mass of each link. The system, besides being useful for rapid prototyping of mechatronic control systems, may be used for fault detection, and also as a teaching tool in Mechatronic/Digital Control Courses. A case study, the real-time simulation and control of the PUMA 560 manipulator, is presented and discussed. Keywords: Real-Time Systems; Programming Environments; Control Systems Design; Teaching Tools; Simulators Robert J. Fornaro, Margaret R. Heil, Alan L. Tharp, Reflections on 10 years of sponsored senior design projects: Students win–clients win!, Journal of Systems and Software, Volume 80, Issue 8, August 2007, Pages 1209-1216, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2006.09.052.
(https://www.sciencedirect.com/science/article/pii/S0164121206002925) Abstract: Undergraduate computer science degree programs often provide an opportunity for students to experience real software projects as a part of their programs of study. These experiences frequently reside in a course in which students form software development teams, are assigned to a project offered by a corporate sponsor and devote one or two semesters to the task of making progress on the project. In an ideal model, faculty mentor student teams who, in turn, behave as subcontractors or consultants to the sponsor. Students work for a grade, not directly for the sponsor as a true subcontractor would. In the ideal model, students demonstrate what they have learned about software engineering process, as well as their ability to implement programmed solutions. Student teams provide progress reports, both oral and written, and directly experience many of the challenges and successes of true software engineering professionals. This paper reports on one such program after 10 years of operation. The technologies and software development processes of student projects are summarized and presented as an informal survey. Student response is discussed in terms of software systems they produced and how they went about producing them. The maturation of these students as software engineering professionals is also discussed. Keywords: Capstone course; Software engineering; Professional communication Z. Jeremić, J. Jovanović, D. Gašević, Student modeling and assessment in intelligent tutoring of software patterns, Expert Systems with Applications, Volume 39, Issue 1, January 2012, Pages 210-222, ISSN 0957-4174, https://doi.org/10.1016/j.eswa.2011.07.010. 
(https://www.sciencedirect.com/science/article/pii/S0957417411009729) Abstract: This paper presents the design, implementation, and evaluation of a student model in DEPTHS (Design Pattern Teaching Help System), an intelligent tutoring system for learning software design patterns. There are many approaches and technologies for student modeling, but choosing the right one depends on the intended functionalities of the intelligent system that the student model is going to be used in. Those functionalities often determine the kinds of information that the student model should contain. The student model used in DEPTHS is a result of combining two widely known modeling approaches, namely, stereotype and overlay modeling. The model is domain independent and can be easily applied in other learning domains as well. To keep the student model updated during the learning process, DEPTHS makes use of a knowledge assessment method based on fuzzy rules (i.e., a combination of production rules and fuzzy logic). The evaluation of DEPTHS, performed with the aim of assessing the system’s overall effectiveness and the accuracy of its student model, indicated several advantages of the DEPTHS system over the traditional approach to learning design patterns, and encouraged us to move on further with this research. Keywords: Intelligent tutoring systems; Student model; Knowledge assessment; Adaptive presentation; Fuzzy logic H. Joel Jeffrey, Human systems analysis in the software engineering curriculum, Journal of Systems and Software, Volume 14, Issue 3, March 1991, Pages 147-153, ISSN 0164-1212, https://doi.org/10.1016/0164-1212(91)90061-A. (https://www.sciencedirect.com/science/article/pii/016412129190061A) Abstract: Software is produced by people, for use by people, most frequently in an organization. As a result, any software engineering project includes related technical objects, concepts, and processes, and human and organizational objects, concepts, and processes.
The relations are many and varied, ranging from the most technical to the most specifically human, such as point of view and emotional reaction. Software engineering education must therefore be almost impossibly eclectic. This paper presents a unique approach to the software engineering curriculum, human systems analysis, that has been in use at Northern Illinois University for six years. It is designed to enable students to analyze the entire super-system, including the producing community, the software system, and the user community, and then use that analysis in all phases of a software project. The curriculum addresses an important area of software engineering in a unique way, is highly unusual in both form and content, and is based on a unique theoretical foundation, Descriptive Psychology. Adam Burbidge, Larry Doyle, Michael Pennotti, High Profile Systems Illustrating Contradistinctive Aspects of Systems Engineering, Procedia Computer Science, Volume 28, 2014, Pages 422-429, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2014.03.052. (https://www.sciencedirect.com/science/article/pii/S187705091400115X) Abstract: Many modern systems have a high degree of dependence on embedded software in order to perform their required functions. Some examples include transportation systems, hand-held devices, and medical equipment, among others. In designing their products, systems engineers typically take a top-down, process-oriented approach, decomposing a complex system into simpler, easier to manage subsystems; the system requirements can then be allocated and flowed down as necessary to the appropriate subsystems. Software engineers take a more bottom-up, object-oriented approach, using simple building blocks to create a more complex system, and enhancing their existing blocks with new ones where necessary. In many cases, both techniques must be employed together in order to design a successful system.
Although it may have been acceptable in the past for simpler systems to view software as a separate subsystem with a fixed set of requirements, the greater complexity of modern systems requires a corresponding improvement in working methodology. With software playing an increasingly pivotal role, systems engineers must become much more familiar with the architecture of the software than previously; likewise, software engineers need a systems-level view to understand which aspects of the design could be volatile due to new stakeholders (bringing with them new requirements), technology upgrades, and the changing world in general. Systems whose success or failure plays out in the public arena provide a rare opportunity to study the factors that contribute to their outcome. Using two such systems, the Denver International Airport baggage handling system and the Apple iPad, this paper will study some best practices that can lead to project success or failure, and show the importance of a rigorous capture and flow-down to both hardware and software of the requirements that must be correct from the start, as well as of designing an architecture that can accommodate the inevitable changes to a system. Designing extensible systems with a tolerance for future changes is a key factor in modern complex systems. The baggage handling system failed in part because of a failure to appreciate the central role of software and an apparent lack of a suitable strategy for handling requirement changes. Methods for creating software that is resilient to change have been well studied; however, what may be somewhat lacking even to the present day is a broader education in the existing body of knowledge, and in how to integrate it with systems engineering methods. The iPad succeeded where many of its predecessors had failed through the successful application of traditional systems engineering techniques and the correct implementation of the hardware elements.
Because the iPad came from a company with experience in software development, system extensibility was not an issue in this case. However, the designers of the earlier systems seemingly failed to understand the actual market needs, failed to develop a corresponding set of requirements to meet those needs, and failed to translate those requirements into an integrated hardware/software solution. Keywords: embedded software; change tolerance; requirements management Alberto Sangiovanni Vincentelli, CHALLENGES AND OPPORTUNITIES FOR SYSTEM THEORY IN EMBEDDED CONTROLLER DESIGN, IFAC Proceedings Volumes, Volume 39, Issue 5, 2006, Pages 2-3, ISSN 1474-6670, https://doi.org/10.3182/20060607-3-IT-3902.00004. (https://www.sciencedirect.com/science/article/pii/S1474667015328470) Abstract: Embedded controllers are essential in today's electronic systems to ensure that the behaviour of complex systems such as cars, airplanes, trains, and building security management systems complies with strict safety constraints. I will review the evolution of embedded systems and the challenges that must be faced in their design. I will also present methodologies aimed at simplifying and speeding up the design process. The role of hybrid systems in the development of embedded controllers will be outlined. Future applications such as wireless sensor networks in an industrial plant will also be presented. The ability to integrate an exponentially increasing number of transistors within a chip, the ever-expanding use of electronic embedded systems to control increasingly many aspects of the “real world”, and the trend to interconnect more and more such systems (often from different manufacturers) into a global network are creating a nightmarish scenario for embedded system designers. Complexity and scope are exploding in the three inter-related but independently growing directions mentioned above, while teams are shrinking in size to further reduce costs.
In this scenario, the three challenges taking center stage are: Heterogeneity and Complexity of the Hardware Platform. The trends mentioned above result in exponential growth in the complexity of the features that can be implemented in hardware. The integration capabilities make it possible to build truly complex systems on a chip, including analog and RF components. The decision of what to place on a chip is no longer dictated by the amount of circuitry that can be placed on the chip but by reliability, yield and ultimately cost (it is well known that analog and RF components force the use of more conservative manufacturing lines with more processing steps than pure digital ICs). Even if manufacturing concerns suggest implementing hardware in separate chips, the resulting package may still be very small given the advances in packaging technology, yielding the concept of System-in-Package (SiP). Pure digital chips are also featuring an increasing number of components. Design time, cost and manufacturing unpredictability for deep submicron technology make the use of custom hardware implementations appealing only for products addressing a very large market and for experienced and financially rich companies. Even for these companies, the present design methodologies are not yielding the necessary productivity, forcing them to increase the size of design and verification teams beyond reason. These IC companies (for example Intel, AMD and TI) are looking increasingly to system design methods to allow them to assemble large chips out of pre-designed components and to reduce validation costs. In this context, the adoption of design models above RTL and of communication mechanisms among components with guaranteed properties and standard interfaces is only a matter of time. Embedded Software Complexity.
Given the cost and risks associated with developing hardware solutions, an increasing number of companies are selecting hardware platforms that can be customized by reconfiguration and/or by software programmability. In particular, software is taking the lion's share of implementation budgets and cost. In cell phones, more than 1 million lines of code is standard today, while in automobiles the estimated number of lines by 2010 is 100 million. The number of lines of source code of embedded software required for defense avionics systems is also growing exponentially. However, as this happens, the complexity explosion of the software component causes serious concerns for the final quality of the products and the productivity of the engineering forces. In transportation, the productivity of embedded software writers using the traditional methods of software development is in the range of a few tens of lines per day. The reasons for such low productivity lie in the time needed for verification of the system and in the long redesign cycles that come from the need to develop full system prototypes, given the lack of appropriate virtual engineering methods and tools for embedded software. Embedded software is substantially different from traditional software for commercial and corporate applications: by virtue of being embedded in a surrounding system, the software must be able to continuously react to stimuli in the desired way, i.e., within bounds on timing, power consumed and cost. Verifying the correctness of the system requires that the model of the software be transformed to include information involving physical quantities and to retain only what is relevant to the task at hand. In traditional software systems, the abstraction process leaves out all the physical aspects of the systems, as only the functional aspects of the code matter. Integration Complexity. A standard technique to deal with complexity is to decompose the system “top-down” into subsystems.
This approach, which has been customarily adopted by the semiconductor industry for years, has limitations, as a designer or a group of designers has to fully comprehend the entire system and to partition its various parts appropriately, a difficult task given the enormous complexity of today's systems. Hence, the future is one of developing systems by composing pieces that, in whole or in part, have already been pre-designed or designed independently by other design groups or even companies. This has been done routinely in vertical design chains, for example in the transportation vertical, albeit in a heuristic and ad hoc way. The resulting lack of an overall understanding of the interplay of the sub-systems, and the difficulties encountered in integrating very complex parts, cause system integration to become a nightmare in the system industry. For example, Jurgen Hubbert, then in charge of the Mercedes-Benz passenger car division, publicly stated in 2003: “The industry is fighting to solve problems that are coming from electronics, and companies that introduce new technologies face additional risks. We have experienced blackouts on our cockpit management and navigation command system and there have been problems with telephone connections and seat heating.” I believe that in today's environment this state is the rule rather than the exception for the leading system OEMs, whether they operate in the transportation domain, in multimedia systems, or in communications. The source of these problems is clearly the increased complexity, but also the difficulty the OEMs have in managing the integration and maintenance process with subsystems that come from different suppliers who use different design methods, different software architectures, different hardware platforms, and different (and often proprietary) Real-Time Operating Systems.
Therefore, there is a need for standards in the software and hardware domains that will allow plug-and-play of subsystems and their implementations, while the competitive advantage of an OEM will increasingly reside in novel and compelling functionalities. I will present a methodology that copes with some of these problems and that can use hybrid system modeling. I will review how this methodology can be applied to the design of embedded controllers for the automotive industry. Finally, I will present the application of the methodology and of hybrid systems to the design of wireless sensor networks in an industrial environment. Keywords: Embedded Systems; Systems Design; Systems Methodology; Control Applications; Distributed Control T.M. Knasel, Artificial intelligence in manufacturing: Forecasts for the use of artificial intelligence in the USA, Robotics, Volume 2, Issue 4, December 1986, Pages 357-362, ISSN 0167-8493, https://doi.org/10.1016/0167-8493(86)90009-4. (https://www.sciencedirect.com/science/article/pii/0167849386900094) Abstract: The use of artificial intelligence in manufacturing has finally emerged as a reality in the United States. Buoyed by the general growth in trained computer scientists, and by lower-cost hardware and software designed for easier use, a number of experimental programs are under way to exploit artificial intelligence for manufacturing purposes. Keywords: Artificial Intelligence; Manufacturing; Industrial Application; Engineering Workstations; Expert Systems; Marketing Forecasts; Functional Applications; Government Programs; Industry Infrastructure U. Sens, Knowledge Representation in an Intelligent Tutoring System: Educational and Domain Knowledge for Automatic Control, IFAC Proceedings Volumes, Volume 23, Issue 8, Part 6, August 1990, Pages 423-428, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)51458-5.
(https://www.sciencedirect.com/science/article/pii/S1474667017514585) Abstract: In the area of Intelligent Computer Aided Instruction (ICAI), knowledge representation turns out to be one of the key factors for efficiency in Intelligent Tutoring Systems. The paper treats some special aspects of knowledge representation for an intelligent tutoring system (ITS) for applications in the education of engineers in the field of software engineering methods and tools. In an ITS, two aspects of knowledge to be represented and three independent functions can be found. The two different and partly independent aspects are tutorial knowledge and domain knowledge; the three functions are supplying help and assistance to the learner, creating and maintaining a learner model, and controlling and directing the progress of tutoring together with the learner. Therefore we suggest a layered structure based on an object-oriented approach. These layers cover knowledge objects for the identified tasks with regard to several aspects. Each object is composed of declarative and procedural knowledge of the domain or the tutorial. Keywords: Computer Aided Instruction; Intelligent Tutoring System; CA-Tools; Tutorial Knowledge; Domain Knowledge; Intelligent Help; OOD Chris Twigg, Paul Hasler, Configurable analog signal processing, Digital Signal Processing, Volume 19, Issue 6, December 2009, Pages 904-922, ISSN 1051-2004, https://doi.org/10.1016/j.dsp.2007.09.013. (https://www.sciencedirect.com/science/article/pii/S105120040700142X) Abstract: We present a viewpoint showing that analog signal processing approaches are becoming configurable and programmable like their digital counterparts, while retaining a huge computational efficiency, for a given power budget, compared to their digital counterparts. We present recent results in programmable and configurable analog signal processing describing the widespread potential of these approaches.
We discuss issues with configurable systems, including size, power, and computational tradeoffs, as well as address the computational efficiency of these approaches. Analog circuits and systems research and education can significantly benefit from the computational flexibility provided by large-scale FPAAs. The component density of these devices is sufficient to synthesize large systems in a short period of time. However, this level of reconfigurable and programmable complexity requires a development platform and CAD tools to demonstrate the capabilities of large-scale FPAAs before they will be widely accepted. To address this need, a self-contained FPAA setup has been developed along with an integrated software design flow. With only an Ethernet connection and an AC power outlet, a researcher or student can explore the numerous analog circuit possibilities provided by large-scale FPAAs. Keywords: Floating-gate circuits; FPAAs; Floating-gate switches; Analog signal processing; Floating-gate systems Anne Burdick, Holly Willis, Digital learning, digital scholarship and design thinking, Design Studies, Volume 32, Issue 6, November 2011, Pages 546-556, ISSN 0142-694X, https://doi.org/10.1016/j.destud.2011.07.005. (https://www.sciencedirect.com/science/article/pii/S0142694X11000597) Abstract: This paper identifies opportunities for design thinking to be integrated into digital learning and digital scholarship initiatives. The paper traces how the rise of digital culture has led to the reconsideration of models for learning and the call for new modes of knowledge production, spearheaded by an array of fields from writing programs to computer science. Drawing upon case studies from new media education and the digital humanities, the paper argues that design thinking that is situated, interpretive, and user-oriented is well suited to these initiatives. 
The paper concludes with a call for design thinking research to engage with emerging models for learning and knowledge production, work whose effects could be felt at an epistemic level for generations. Keywords: design knowledge; epistemology; interdisciplinarity; software design; technology R.K. Appiah, J.H. Daigle, The Role of Computers in Engineering Education in Zimbabwe, IFAC Proceedings Volumes, Volume 21, Issue 6, July 1988, Pages 227-233, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)53870-7. (https://www.sciencedirect.com/science/article/pii/S1474667017538707) Abstract: This paper discusses some problems of engineering manpower formation in Zimbabwe and outlines the steps that have been taken to resolve these problems. One of the steps, the Engineering Microcomputer-Assisted Learning Project at the University of Zimbabwe, is described here. A control systems course is used to illustrate the courseware development activities and the course re-organisation involved in the CAL strategy of the project. The problem of CAL project evaluation is also highlighted. Keywords: Educational aid; Computer-Assisted Learning; teaching; software engineering; control theory Burak Turhan, Gozde Kocak, Ayse Bener, Data mining source code for locating software bugs: A case study in telecommunication industry, Expert Systems with Applications, Volume 36, Issue 6, August 2009, Pages 9986-9990, ISSN 0957-4174, https://doi.org/10.1016/j.eswa.2008.12.028. (https://www.sciencedirect.com/science/article/pii/S0957417408009275) Abstract: In a large software system, knowing which files are most likely to be fault-prone is valuable information for project managers. They can use such information in prioritizing software testing and allocating resources accordingly. However, our experience shows that it is difficult to collect and analyze fine-grained test defects in a large and complex software system.
On the other hand, previous research has shown that companies can safely use cross-company data with nearest neighbor sampling to predict their defects in case they are unable to collect local data. In this study we analyzed 25 projects of a large telecommunication system. To predict the defect proneness of modules, we trained models on publicly available NASA MDP data. In our experiments we used static call graph based ranking (CGBR) as well as nearest neighbor sampling for constructing method-level defect predictors. Our results suggest that, for the analyzed projects, at least 70% of the defects can be detected by inspecting only (i) 6% of the code using a Naïve Bayes model, or (ii) 3% of the code using the CGBR framework. Keywords: Software testing; Defect prediction; Software bugs; Case study Anthony Savidis, Constantine Stephanidis, Inclusive development: Software engineering requirements for universally accessible interactions, Interacting with Computers, Volume 18, Issue 1, January 2006, Pages 71-116, ISSN 0953-5438, https://doi.org/10.1016/j.intcom.2005.06.005. (https://www.sciencedirect.com/science/article/pii/S0953543805000597) Abstract: The notion of ‘universal access’ reflects the concept of an Information Society in which potentially anyone (i.e. any user) will interact with computing machines, at anytime and anyplace (i.e. in any context of use) and for virtually anything (i.e. for any task). Towards reaching a successful and cost-effective realization of this vision, it is critical to ensure that future interface development tools provide all the necessary instrumentation to support inclusive design, i.e. facilitate inclusive development. In the meantime, it is crucial that both tool developers and interface developers acquire awareness regarding the key development features they should pursue when investigating the most appropriate software engineering support in addressing such a largely demanding development goal (i.e.
universally accessible interactions). This paper discusses a corpus of key development requirements for building universally accessible interactions that has been consolidated from real practice, in the course of six medium-to-large scale research projects, all completed within a 10-year timeframe. Kristian Beckers, Dominik Holling, Isabelle Côté, Denis Hatebur, A structured hazard analysis and risk assessment method for automotive systems—A descriptive study, Reliability Engineering & System Safety, Volume 158, February 2017, Pages 185-195, ISSN 0951-8320, https://doi.org/10.1016/j.ress.2016.09.004. (https://www.sciencedirect.com/science/article/pii/S0951832016305002) Abstract: The 2011 release of the first version of the ISO 26262 standard for automotive systems demands the elicitation of safety goals following a rigorous method for hazard and risk analysis. Companies are struggling with the adoption of the standard due to ambiguities, documentation demands and the alignment of the standard's demands with existing processes. We previously proposed a structured engineering method to deal with these problems, developed by applying action research together with an OEM. In this work, we evaluate through a descriptive study how applicable the method is for junior automotive software engineers. We provided the method to 8 members of the master course Automotive Software Engineering (ASE) at the Technical University of Munich. The participants had each been working in the automotive industry for 1–4 years in parallel to their studies. We investigated their application of our method to an electronic steering column lock system. The participants applied our method in a first round alone and afterwards discussed their results in groups. Our data analysis revealed that the participants could apply the method successfully and that the hazard analysis and risk assessment achieved high precision and productivity.
Moreover, the precision could be improved significantly during group discussions. Keywords: Requirements; ISO 26262; Automotive; Safety; Structured method; Empirical study Tim King, Millwrights to mechatronics: The merits of multi-disciplinary engineering, Mechatronics, Volume 5, Issues 2–3, March–April 1995, Pages 95-115, ISSN 0957-4158, https://doi.org/10.1016/0957-4158(94)E0025-L. (https://www.sciencedirect.com/science/article/pii/0957415894E0025L) Abstract: In recent years the term “Mechatronics” has come into use to describe a multi-disciplinary approach to engineering (and particularly engineering design) in which a symbiosis of mechanical, electrical, electronic, computer and software engineering is used to create new design solutions to engineering problems. These mechatronic designs can often be more effective than traditional solutions rooted in mono-disciplinary engineering. This paper notes that virtually all engineering was once the province of millwrights and discusses its division into the many currently recognised constituent, and largely separate, disciplines. It is argued that this has followed from the principles of efficiency through division of production which have long been a tenet of capitalist manufacturing. In the closing two decades of the twentieth century there has been a move to return to more integrated production techniques and, with the development of microprocessors and microprocessor controlled systems and products, a need for integration in engineering design and education. The ways in which microprocessors have been applied to, or embedded in, contemporary products, machines and systems are categorised and examples of the design of mechatronic devices, with which the author has been associated, are presented. 
Ed Seidewitz, General object-oriented software development: Background and experience, Journal of Systems and Software, Volume 9, Issue 2, February 1989, Pages 95-108, ISSN 0164-1212, https://doi.org/10.1016/0164-1212(89)90013-7. (https://www.sciencedirect.com/science/article/pii/0164121289900137) Abstract: The effective use of Ada requires the adoption of modern software-engineering techniques such as object-oriented methodologies. A Goddard Space Flight Center Software Engineering Laboratory Ada pilot project has provided an opportunity for studying object-oriented design in Ada. The project involves the development of a simulation system in Ada in parallel with a similar Fortran development. As part of the project, the Ada development team trained in and evaluated object-oriented and process-oriented design methodologies for Ada. Finding these methodologies limited in various ways, the team created a general object-oriented development methodology that they applied to the project. This paper discusses some background on the development of the methodology, describes the main principles of the approach, and presents some experiences with using the methodology, including a general comparison of the Ada and Fortran simulator designs. Bogdan Marculescu, Simon Poulding, Robert Feldt, Kai Petersen, Richard Torkar, Tester interactivity makes a difference in search-based software testing: A controlled experiment, Information and Software Technology, Volume 78, October 2016, Pages 66-82, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2016.05.009. (https://www.sciencedirect.com/science/article/pii/S0950584916300957) Abstract: Context: Search-based software testing promises to provide users with the ability to generate high quality test cases, and hence increase product quality, with a minimal increase in the time and effort required.
The development of the Interactive Search-Based Software Testing (ISBST) system was motivated by a previous study to investigate the application of search-based software testing (SBST) in an industrial setting. ISBST allows users to interact with the underlying SBST system, guiding the search and assessing the results. An industrial evaluation indicated that the ISBST system could find test cases that are not created by testers employing manual techniques. The validity of the evaluation was threatened, however, by the low number of participants. Objective: This paper presents a follow-up study, to provide a more rigorous evaluation of the ISBST system. Method: To assess the ISBST system, a two-way crossover controlled experiment was conducted with 58 students taking a Verification and Validation course. The NASA Task Load Index (NASA-TLX) was used to assess the workload experienced by the participants in the experiment. Results: The experimental results validated the hypothesis that the ISBST system generates test cases that are not found by the same participants employing manual testing techniques. A follow-up laboratory experiment also investigated the importance of interaction in obtaining the results. In addition to this main result, the subjective workload was assessed for each participant by means of the NASA-TLX tool. The evaluation showed that, while the ISBST system required more effort from the participants, they achieved the same performance. Conclusions: The paper provides evidence that the ISBST system develops test cases that are not found by manual techniques, and that interaction plays an important role in achieving that result. Keywords: Search-based software testing; Interactive search-based software testing; Controlled experiment Jan A. 
Bergstra, Paul Klint, About “trivial” software patents: The IsNot case, Science of Computer Programming, Volume 64, Issue 3, 1 February 2007, Pages 264-285, ISSN 0167-6423, https://doi.org/10.1016/j.scico.2006.09.003. (https://www.sciencedirect.com/science/article/pii/S0167642306001754) Abstract: So-called “trivial” software patents undermine the patenting system and are detrimental for innovation. In this paper we use a case-based approach to get a better understanding of this phenomenon. First, we establish a baseline for studying the relation between software development and intellectual property rights by formulating a life cycle for the patenting system as well as three variations of the software life cycle: the defensive patent-aware software life cycle that prevents patent infringements, the more offensive patent-based software life cycle that aims both at preventing infringements and at creating new patents, and the IPR-based software life cycle that considers all forms of protection of intellectual property rights including copyright and secrecy. Next, we study an application for a software patent concerning the inequality operator and a granted European patent on memory management. We also briefly mention other examples of trivial patents. These examples serve to clarify the issues that arise when integrating patents in the software life cycle. In an extensive discussion, we cover the difference between expression and idea, the role of patent claims, software patents versus computer implemented inventions, the role of prior art, implications of software patents for open source software, for education, and for government-funded research. We conclude the discussion with the formulation of an “integrity axiom” for software patent authors and owners and sketch an agenda for software patent research. 
We conclude that patents are too important to be left to lawyers and economists and that a complete reinterpretation of the patenting system from a software engineering perspective is necessary to understand all ramifications of software patents. We end with explicit conclusions and policy recommendations. Keywords: Software patents; Trivial patents; Intellectual property rights; Software engineering; Patent life cycle; Software engineering life cycle; Open source software; Prior art; Patent claims; Patent policy Nur Hidayanti Binti Ambrizal, Awais Farooqi, Osama I. Alsultan, Nukman Bin Yusoff, Design and Development of CNC Robotic Machine Integrate-able with Nd-Yag Laser Device, Procedia Engineering, Volume 184, 2017, Pages 145-155, ISSN 1877-7058, https://doi.org/10.1016/j.proeng.2017.04.079. (https://www.sciencedirect.com/science/article/pii/S1877705817315849) Abstract: Modern machining technologies and intelligent systems are expensive, and they call for machines that are easy to handle and can be integrated with various devices to perform multiple machining tasks. Computer Numerically Controlled (CNC) machines are accessible to manufacturers for performing several machining tasks because of their effectiveness in handling accuracy, but the majority of CNC machines are costly due to their complex, albeit efficient, machine and software designs. This project aims to develop a cost-effective and easily integrate-able CNC machine system that can accept an add-on Nd-Yag laser device. The mechanical design, the path of the laser beam entering the machine and exiting the laser head, and the accompanying electronics control structure driven by CNC software are developed. The designed CNC Nd-Yag laser machine serves as a workshop teaching tool and is targeted at laser cutting and suitable engraving or welding tasks for small and medium scale industries. 
In testing, the machine achieved its expected positional accuracy, and the control mechanism for the Nd-Yag laser path reflection function was successfully verified. Keywords: CNC Robotics; Laser machining; Cost Effective Manufacturing; Software Control System; Workshop Teaching Tool Görel Hedin, Lars Bendix, Boris Magnusson, Teaching extreme programming to large groups of students, Journal of Systems and Software, Volume 74, Issue 2, 15 January 2005, Pages 133-146, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2003.09.026. (https://www.sciencedirect.com/science/article/pii/S0164121203002930) Abstract: We find the extreme programming methodology highly suitable for introducing undergraduate students to software engineering. To be able to apply this methodology at a reasonable teaching cost for large student groups, we have developed two courses that work in tandem: a team programming course taken by more than 100 students, and a coaching course taken by around 25 students. In this paper we describe our view of how extreme programming fits into the software engineering curriculum, our approach to teaching it, and our experiences, based on two years of running these courses. Particularly important aspects of our setup include team coaching (by older students), fixed working hours, and colocation during development. Our experiences so far are very positive, and we see that students get a good basic understanding of the important concepts in software engineering, rooted in their own practical experience. Beverly Hunter, Designing educational software for the information age: Dilemmas and paradoxes, Education and Computing, Volume 5, Issues 1–2, 1989, Pages 111-117, ISSN 0167-9287, https://doi.org/10.1016/S0167-9287(89)80019-6. 
(https://www.sciencedirect.com/science/article/pii/S0167928789800196) Abstract: Design considerations of the developer of educational software are predominantly concerned with economic and organizational arrangements for disseminating educational materials to schools. As a consequence, design considerations concerning educational needs, pedagogical research or technological opportunities have relatively little impact on software design. This paper identifies considerations concerning: interaction among learners, teachers, software and learning environments; learning processes; teachers' rôles; effects of knowledge representation on learning and understanding; collaborative learning; evaluation methods; curriculum reform; research on cognitive processes; and technological capabilities. Key Words: Educational software; Economic and organizational design constraints; Learning environments; Interaction; Learning processes; Teachers' rôle; Knowledge representation; Collaborative learning; Evaluation methods; Curriculum reform; Cognitive processes; Technological capabilities Margaret J. Cox, The impact of evaluation through classroom trials on the design and development of educational software, Education and Computing, Volume 5, Issues 1–2, 1989, Pages 35-41, ISSN 0167-9287, https://doi.org/10.1016/S0167-9287(89)80008-1. (https://www.sciencedirect.com/science/article/pii/S0167928789800081) Abstract: Published papers are available on numerous models for the production of educational software. Teachers have contributed curriculum ideas to many of these. Some Computer Assisted Learning (CAL) development methods include trials of software in schools to provide feedback in the development process. This paper discusses the effectiveness of classroom trials, using an example from the Computers in the Curriculum project. 
Consideration is given to the relation between this formative development and general research questions concerning the contribution of CAL to learning. Key Words: Computer assisted learning (CAL); Educational software; Design; Development; Evaluation; Classroom trial; Teacher feedback; Third Angle Projection Tsong Yueh Chen, Fei-Ching Kuo, Robert G. Merkel, T.H. Tse, Adaptive Random Testing: The ART of test case diversity, Journal of Systems and Software, Volume 83, Issue 1, January 2010, Pages 60-66, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2009.02.022. (https://www.sciencedirect.com/science/article/pii/S0164121209000405) Abstract: Random testing is not only a useful testing technique in itself, but also plays a core role in many other testing methods. Hence, any significant improvement to random testing has an impact throughout the software testing community. Recently, Adaptive Random Testing (ART) was proposed as an effective alternative to random testing. This paper presents a synthesis of the most important research results related to ART. In the course of our research and through further reflection, we have realised how the techniques and concepts of ART can be applied in a much broader context, which we present here. We believe such ideas can be applied in a variety of areas of software testing, and even beyond software testing. Amongst these ideas, we particularly note the fundamental role of diversity in test case selection strategies. We hope this paper serves to provoke further discussions and investigations of these ideas. Keywords: Software testing; Random testing; Adaptive random testing; Adaptive random sequence; Failure-based testing; Failure pattern W.K. Chan, S.C. Cheung, Jeffrey C.F. Ho, T.H. 
Tse, PAT: A pattern classification approach to automatic reference oracles for the testing of mesh simplification programs, Journal of Systems and Software, Volume 82, Issue 3, March 2009, Pages 422-434, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2008.07.019. (https://www.sciencedirect.com/science/article/pii/S0164121208001817) Abstract: Graphics applications often need to manipulate numerous graphical objects stored as polygonal models. Mesh simplification is an approach to vary the levels of visual details as appropriate, thereby improving on the overall performance of the applications. Different mesh simplification algorithms may cater for different needs, producing diversified types of simplified polygonal model as a result. Testing mesh simplification implementations is essential to assure the quality of the graphics applications. However, it is very difficult to determine the oracles (or expected outcomes) of mesh simplification for the verification of test results. A reference model is an implementation closely related to the program under test. Is it possible to use such reference models as pseudo-oracles for testing mesh simplification programs? If so, how effective are they? This paper presents a fault-based pattern classification methodology called PAT, to address the questions. In PAT, we train the C4.5 classifier using black-box features of samples from a reference model and its fault-based versions, in order to test samples from the subject program. We evaluate PAT using four implementations of mesh simplification algorithms as reference models applied to 44 open-source three-dimensional polygonal models. Empirical results reveal that the use of a reference model as a pseudo-oracle is effective for testing the implementations of resembling mesh simplification algorithms. 
However, the results also show a tradeoff: When compared with a simple reference model, the use of a resembling but sophisticated reference model is more effective and accurate but less robust. Keywords: Test oracles; Software testing; Mesh simplification; Graphics rendering; Pattern classification reference models Robert A. Greenes, Stephan R.A. Deibel, The DeSyGNER knowledge management architecture: a building block approach based on an extensible kernel, Artificial Intelligence in Medicine, Volume 3, Issue 2, April 1991, Pages 95-111, ISSN 0933-3657, https://doi.org/10.1016/0933-3657(91)90021-3. (https://www.sciencedirect.com/science/article/pii/0933365791900213) Abstract: The Decision Systems Group has been developing a ‘building block’ approach for creating Knowledge Management (KM) applications for medical education and decision support. Potential functions and knowledge access modes to be supported include query, browsing, testing, simulation, didactic instruction, problem solving, and personal file management. Knowledge is considered to be available in multiple forms, non-adaptive and adaptive. We believe that organization and combination of disparate components, in order to build varied and complex applications as required for KM, is best achieved through a software engineering approach based on a kernel set of functions that provide a consistent set of services for all applications, facilitating extensibility and inter-application compatibility. For this purpose, we are exploring a prototype kernel architecture called DeSyGNER (the Decision Systems Group Nucleus of Extensible Resources). 
Features addressed by DeSyGNER include methods for decomposition of applications into modular units and identification of their functional dependencies; methods of structuring applications to separate their storage, processing, and presentation components; database requirements for indexing and composing complex structures from disparate, disjoint data elements; and methods to support multi-user cooperative development. Keywords: Knowledge management; modular architecture; software engineering Gianluca Palli, Claudio Melchiorri, REALTIME HARDWARE EMULATION FOR RAPID PROTOTYPING AND TESTING OF DIGITAL CONTROL SYSTEMS, IFAC Proceedings Volumes, Volume 39, Issue 16, 2006, Pages 241-246, ISSN 1474-6670, https://doi.org/10.3182/20060912-3-DE-2911.00044. (https://www.sciencedirect.com/science/article/pii/S1474667015341720) Abstract: In this paper, a software architecture for the rapid prototyping of digital control systems and for realtime simulation of the control loop, including both controller and plant dynamics, is presented. This system is based on RTAI-Linux, and its main goal is to simplify the testing phase of digital controllers by using a unified interface between the controller and the simulated or real plant. The unified interface, based on the COMEDI library, allows the controller to be switched from the simulated to the real plant without any modification of the control software. The system, besides being useful for rapid prototyping of mechatronic control systems, may be used for fault detection, and also as a teaching tool in Mechatronic/Digital Control Courses. A case study, a controller for an inverted pendulum, is presented and discussed. 
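The unified-interface idea in the Palli and Melchiorri abstract, where the same controller drives either a simulated or a real plant, can be sketched in miniature. This is only an illustrative sketch under assumed names; the actual system is built on RTAI-Linux and the COMEDI library, not on Python:

```python
# Illustrative sketch (hypothetical names) of a unified controller/plant
# interface in the spirit of Palli and Melchiorri: the controller is written
# against an abstract Plant, so swapping the simulated plant for the real one
# requires no change to the control software.
from abc import ABC, abstractmethod

class Plant(ABC):
    """Unified interface exposed by both the simulated and the real plant."""
    @abstractmethod
    def read(self) -> float: ...            # sample the plant output
    @abstractmethod
    def write(self, u: float) -> None: ...  # apply the control input

class SimulatedPlant(Plant):
    """Toy first-order model x' = -a*x + b*u, integrated with forward Euler."""
    def __init__(self, a: float = 1.0, b: float = 1.0, dt: float = 0.01):
        self.a, self.b, self.dt, self.x = a, b, dt, 0.0
    def read(self) -> float:
        return self.x
    def write(self, u: float) -> None:
        self.x += self.dt * (-self.a * self.x + self.b * u)

class PController:
    """Proportional controller; it never knows which plant it drives."""
    def __init__(self, kp: float, setpoint: float):
        self.kp, self.setpoint = kp, setpoint
    def step(self, plant: Plant) -> None:
        error = self.setpoint - plant.read()
        plant.write(self.kp * error)

# Run the loop against the simulated plant; a RealPlant subclass wrapping the
# data-acquisition hardware could be substituted without touching PController.
plant = SimulatedPlant()
controller = PController(kp=2.0, setpoint=1.0)
for _ in range(2000):                       # 20 s of simulated time at dt = 0.01
    controller.step(plant)
print(round(plant.read(), 3))               # settles near kp*b/(a + kp*b) = 2/3
```

The point of the design is the one the abstract makes: because the switch happens behind a fixed interface, the testing phase exercises exactly the control code that later runs against the hardware.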
Keywords: Realtime systems; programming environments; control systems design; nonlinear systems; simulators Daniel Johnson, John Gardner, Janet Wiles, Experience as a moderator of the media equation: the impact of flattery and praise, International Journal of Human-Computer Studies, Volume 61, Issue 3, September 2004, Pages 237-258, ISSN 1071-5819, https://doi.org/10.1016/j.ijhcs.2003.12.008. (https://www.sciencedirect.com/science/article/pii/S1071581903002179) Abstract: This study extends previous media equation research, which showed that flattery from a computer can produce the same general effects as flattery from humans. Specifically, the study explored the potential moderating effect of experience on the impact of flattery from a computer. One hundred and fifty-eight students from the University of Queensland voluntarily participated in the study. Participants interacted with a computer and were exposed to one of three kinds of feedback: praise (sincere praise), flattery (insincere praise), or control (generic feedback). Questionnaire measures assessing participants’ affective state, attitudes and opinions were taken. Participants of high experience, but not low experience, displayed a media equation pattern of results, reacting to flattery from a computer in a manner congruent with people’s reactions to flattery from other humans. High experience participants tended to believe that the computer spoke the truth, experienced more positive affect as a result of flattery, and judged the computer's performance more favourably. These findings are interpreted in light of previous research, and the implications for software design in fields such as entertainment and education are considered. Keith Stenning, Corin Gurr, Human–formalism interaction: Studies in communication through formalism, Interacting with Computers, Volume 9, Issue 2, 3 November 1997, Pages 111-128, ISSN 0953-5438, https://doi.org/10.1016/S0953-5438(97)00012-X. 
(https://www.sciencedirect.com/science/article/pii/S095354389700012X) Abstract: A recurrent theme in studying the interaction between human and formalism is the understanding of how people interact with representations in reasoning and communication. In contrast to the best known theories, which approach the question of the impact of representation upon reasoning through explanations in terms of human computational architecture, we present here a more fundamental approach. This approach separates the problem into two parts: issues about computational complexity arising from the nature of the semantic interpretation (issues which are abstract with regard to architecture), and issues about how human computational architecture in particular can be brought to bear on different representations. On this view, for example, diagrams are often logically inexpressive and this is why they lead to efficient inference. This paper presents experiences in applying this semantic approach to the empirical study of modality assignment in three disparate domains: logic teaching, safety critical software engineering and the teaching of formality. We show how, in each of these cases, an account of the semantics of representations in simple formal terms permits the analysis and modelling of what would otherwise be incomprehensibly complicated behavioural phenomena. The results of these apparently diverse studies indicate that individual differences in what might be termed cognitive styles have a significant effect upon a human's use and understanding of various formalisms. This, we argue, is evidence that HCI researchers require a more analytical means to relate the cognitive and social sides of HCI than has previously been available. Furthermore, we also take the studies presented here as evidence that our approach is a substantial step towards providing such a means of analysis. 
Keywords: Cognitive science; HCI; Representation and reasoning; Teaching and learning; Formal methods Bashar Nuseibeh, Steve Easterbrook, Alessandra Russo, Making inconsistency respectable in software development, Journal of Systems and Software, Volume 58, Issue 2, 1 September 2001, Pages 171-180, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(01)00036-X. (https://www.sciencedirect.com/science/article/pii/S016412120100036X) Abstract: The development of software systems inevitably involves the detection and handling of inconsistencies. These inconsistencies can arise in system requirements, design specifications and, quite often, in the descriptions that form the final implemented software product. A large proportion of software engineering research has been devoted to consistency maintenance, or geared towards eradicating inconsistencies as soon as they are detected. Software practitioners, on the other hand, live with inconsistency as a matter of course. Depending on the nature of an inconsistency, its causes and its impact, they sometimes choose to tolerate its presence, rather than resolve it immediately, if at all. This paper argues for “making inconsistency respectable” [A phrase first used by D. Gabbay and A. Hunter (in: Proceedings of Fundamentals of Artificial Intelligence Research'91, Springer, Berlin, p. 19; in: Symbolic and Quantitative Approaches to Reasoning and Uncertainty, Lecture Notes in Computer Science, Springer, Berlin, 1992, p. 129) to describe the same sentiments that motivated our work.] – sometimes avoided or ignored, but more often used as a focus for learning and as a trigger for further (constructive) development actions. The paper presents a characterization of inconsistency in software development and a framework for managing it in this context. It draws upon practical experiences of dealing with inconsistency in large-scale software development projects and relates some lessons learned from these experiences. 
Keywords: Software specification; Requirements engineering; Inconsistency management; Inconsistency handling; Conflict K. Henning, P. Ijewski, B. Schürmann, Design of Man-Machine Interfaces and Work Content in a Container Transfer Process, IFAC Proceedings Volumes, Volume 16, Issue 22, November 1983, Pages 23-28, ISSN 1474-6670, https://doi.org/10.1016/S1474-6670(17)61549-0. (https://www.sciencedirect.com/science/article/pii/S1474667017615490) Abstract: A detailed design of the working places in a high-level automation of a train-to-train container transfer process has been performed. It will be shown that the original idea of a fully automatic system without any personnel could not be realised. Due to the variety of working conditions expected, the control system operates in several automation modes, which allow the choice between various levels of automation. Thus the work content can be changed according to the choice of the workers. Because various companies and institutions developed the system in cooperation, the communication process between the designers was at times complicated, and economic limitations strongly influenced the design phases. It will be shown how, despite these circumstances, aspects of man-machine interface design and work content could be incorporated into the design process. Keywords: Hierarchical systems; man-machine systems; work design; computer software Michael Gegick, Laurie Williams, On the design of more secure software-intensive systems by use of attack patterns, Information and Software Technology, Volume 49, Issue 4, April 2007, Pages 381-397, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2006.06.002. (https://www.sciencedirect.com/science/article/pii/S0950584906000802) Abstract: Retrofitting security implementations to a released software-intensive system or to a system under development may require significant architectural or coding changes. 
These late changes can be more difficult and costly than if performed early in the software process. We have created regular expression-based attack patterns that show the sequential events that occur during an attack. By performing a Security Analysis for Existing Threats (SAFE-T), software engineers can match the symbols of a regular expression to their system design. An architectural analysis that identifies security vulnerabilities early in the software process can prepare software engineers for which security implementations are necessary when coding starts. A case study involving students in an upper-level undergraduate security course suggests that SAFE-T can be performed by relatively inexperienced engineers who are not experts in security. Data from the case study also suggest that the attack patterns do not restrict themselves to vulnerabilities in specific environments. Keywords: Software and system safety; Patterns Stefan Jähnichen, Teaching software engineering—experience from the past, needs for the future, Education and Computing, Volume 8, Issue 4, April 1993, Pages 273-285, ISSN 0167-9287, https://doi.org/10.1016/0167-9287(93)90367-A. (https://www.sciencedirect.com/science/article/pii/016792879390367A) Abstract: The field of software engineering evolved during the last decade from pure (but excellent) programming towards an engineering discipline including managerial, organizational, hardware and even commercial aspects. Nowadays, engineers have to cope with the development of very complex systems which are composed of various components such as software, hardware, interfaces, etc., and which are expected to guarantee robustness, reliability and even correctness for their products. Thus the breadth of the discipline poses a variety of problems for teaching the field, sometimes resulting in courses that merely scratch the surface of those problems. 
The paper presents a curriculum for a software engineering course and identifies directions for further evolution. Keywords: software engineering; algorithms; methods Kari Rönkkö, Interpretation, interaction and reality construction in software engineering: An explanatory model, Information and Software Technology, Volume 49, Issue 6, June 2007, Pages 682-693, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2007.02.014. (https://www.sciencedirect.com/science/article/pii/S0950584907000171) Abstract: The incorporation of social issues in software engineering is limited. Still, during the last 20 years the social element inherent in software development has been addressed in a number of publications that identified a lack of common concepts, models, and theories for discussing software development from this point of view. It has been suggested that we need to take interpretative and constructive views more seriously if we are to incorporate the social element in software engineering. Until now, however, we have lacked papers presenting ‘simple’ models explaining why. This article presents a model that helps us better understand interpretation, interaction and reality construction from a natural language perspective. The concepts and categories that accompany the model provide a new frame of reference useful in software engineering research, teaching, and methods development. Keywords: Social; Interpretation; Interaction; Indexicality; Communication; Software development; Natural language; Methods development; Management; Software engineering practice George Triantafyllakos, George Palaigeorgiou, Ioannis A. Tsoukalas, Fictional characters in participatory design sessions: Introducing the “design alter egos” technique, Interacting with Computers, Volume 22, Issue 3, May 2010, Pages 165-175, ISSN 0953-5438, https://doi.org/10.1016/j.intcom.2009.12.003. 
(https://www.sciencedirect.com/science/article/pii/S0953543809001088) Abstract: In recent years, the discourse concerning the relationship between narrative theory (storytelling in general), interactivity, and design has been undeniably noteworthy. A significant part of this discourse concerns the use of fictional characters in design. Fictional characters have been used as user representatives, either substituting actual users or supporting idea generation, and their foremost objective is to facilitate the identification of user needs and goals and to support the development of detailed and comprehensive scenarios. Motivated by this ongoing discourse and inspired by relevant approaches to the use of fictional characters in design, we investigate the applicability and effectiveness of their use as a creative technique in participatory design sessions. We present a novel approach to using fictional characters in the collaborative design of educational software with students, one that asks the participants to form and use their own fictional characters, for which we introduce the term “design alter egos”, as a means of eliciting requirements and design ideas. To evaluate our approach, we conducted 20 collaborative design sessions with 94 undergraduate university students (aged 19–24) to elicit requirements for the design of an ideal course website. The analysis of the results suggests that the design alter egos technique liberated the majority of the students from the fear of straightforwardly exposing themselves, supported and enhanced their introspection, stimulated their creativity, and helped to establish an informal and constructive atmosphere throughout the design sessions. We suggest the use of design alter egos as an engaging and effective supportive technique for co-designing educational software with students. 
Keywords: Design alter egos; Fictional characters; Design in imaginary landscapes; Participatory design; Collaborative software design; Student-centered design David Budgen, Peter Henderson, Chic Rattray, Academic/industrial collaboration in a postgraduate MSc course in Software Engineering, Journal of Systems and Software, Volume 10, Issue 4, November 1989, Pages 261-266, ISSN 0164-1212, https://doi.org/10.1016/0164-1212(89)90072-1. (https://www.sciencedirect.com/science/article/pii/0164121289900721) Abstract: This paper outlines the organization of an MSc in Software Engineering that has been set up as a specialist conversion course for graduates who have had some experience of computer programming. The most distinctive feature of the program is that this degree involves the participation of an industrial partner in providing some of the teaching and a period of industrial placement. Our experiences with the academic and practical aspects of such a structure have been included. (This paper is an extension of an earlier report [1] and, as we did then, we should explain to readers in the U.S.A. that the British tend to use the term “course” when referring to both a degree programme and a course unit.) Luqi, Manfred Broy, Preface: Volume 25, Electronic Notes in Theoretical Computer Science, Volume 25, 1999, Pages 145-146, ISSN 1571-0661, https://doi.org/10.1016/S1571-0661(05)80554-7. (https://www.sciencedirect.com/science/article/pii/S1571066105805547) Abstract: Software Engineering to our Planning Horizon. The Army Research Office, National Science Foundation, Office of Naval Research, and the Defense Advanced Research Projects Agency sponsored the 1998 Monterey Workshop on Engineering Automation for Computer Based Systems. This workshop is the 6th in a series of international workshops with the general theme of increasing the practical impact of formal methods for software and systems engineering. 
The workshop took place in Carmel, California, in late 1998, hosted by the Naval Postgraduate School. Since 1990, the previous workshops in the series focused on real-time and concurrent systems, software merging and slicing, software evolution, software architecture, and requirements targeting software. This workshop focused on engineering automation. The objectives of the workshops are to encourage interaction between the research and engineering communities, exchange recent results, assess their significance and encourage transfer of relevant results to practice, communicate current problems in engineering practice to researchers, and help focus future research on directions that address pressing practical needs. Over the past years, we have witnessed a slow but steady decrease in the gap between the theoretical and practical sides of the software engineering community. We hope that this trend will continue and will accelerate improvements in the state of software engineering practice and theory. Software problems have been quite visible to the public due to spectacular disasters in space missions or telephone blackouts and are receiving increasing attention with the nearing Y2K deadline. It is a good time to demonstrate concrete improvements in our discipline. The continued doubling of computing speed and memory capacity every 18 months implies that, for large distributed systems, technology, tactics and doctrine, the only constant may well be that change is inevitable. The dynamic aspect of systems is not supported by current practice and is seldom emphasized in current research. Software evolution research is extremely important for achieving modifiable and dependable systems in the future. Improved methods for reengineering are also needed to bring legacy systems to the condition where they can benefit from improvements in software evolution technology. 
Thirty years ago, when the term software engineering was coined, there was a lack of theoretical foundation for many practical concepts in computing. That is no longer true. A solid body of foundational work is available now that addresses many challenging issues related to software and computing, including specification techniques for systems and data, logical calculi for concurrent, distributed, and real-time systems, logical concepts related to interactive systems, and formal models of programming language semantics with a variety of inference systems. The challenge is to put these results to work, to develop theory that better supports engineering needs, and to improve practice. This will require cooperation and a concerted effort from both theoreticians and practitioners. We will need advances in education and improvements in theoretical approaches to meet the demand of practical engineering for computer software. To be attractive to practitioners, formal methods, mathematical foundations and automated engineering tools need to provide return on investment. These approaches must be cost effective to successfully compete with other development methods, and the benefits they provide in terms of software quality must have sufficient economic value to justify investment in them. These goals require some uncomfortable changes in the research community. Mathematical elegance is not enough for the success of an engineering theory: applicability, tractability, and ease of understanding are often more important in practice than logical completeness or conceptual elegance of the principles that guarantee the soundness of the methods. We must carefully separate the application of mathematics to demonstrate the soundness of a formal software model or to construct automated tools for engineers from the formal models that will be used “by engineers as design representations”. The formal aspects of computing cannot be studied in isolation if we are to have practical impact.
The different aspects of technical, educational, and management issues are so closely intertwined in software engineering practice that it is risky and ineffective to study and develop them in isolation if practical applicability is a prominent goal. This puts interdisciplinary requirements on researchers and lends importance to interactions between experts from different specialties, such as those promoted by this workshop. We have collected some excellent papers for the workshop. These articles are written by internationally renowned contributors from both academia and industry who examine current best practices and propose strategies for improvement, as well as a summary of the high points of the discussions at the workshop. The broadest range of expert opinion and views was represented. Members of the academic, government, military and commercial worlds came to share their vision, insight and concerns. By synthesizing the expertise of these communities we hope to gain significant insight into the problems and solutions. The discussions ranged beyond the narrow confines of software and mathematics, to address engineering of systems containing hardware and people as well as software, and related issues that include requirements elicitation, management, and engineering education. Discussions at the workshop addressed technical advances in mature areas, such as a new decision procedure for a queue data type and novel types of model checking, as well as ideas for new directions, such as lightweight inference and co-algebraic models for interactive systems. The workshop helped to reduce the gap between theory and practice, and to recharge the research community to address problems of immediate concern. Workshop attendees identified and discussed both the technologically dependent and technologically independent trends within the engineering automation of computer based systems for the near term and out to our planning horizon.
It is our pleasure to thank the workshop advisory, program and local arrangements committees, and the workshop sponsors, NSF, ONR, DARPA, and especially ARO, for their vision of a principled engineering solution for software and for their years of tireless effort in supporting a series of workshops to bring everyone together. GD Alford, Teaching computer-aided engineering on the BBC microcomputer, Microprocessors and Microsystems, Volume 10, Issue 6, July–August 1986, Pages 313-324, ISSN 0141-9331, https://doi.org/10.1016/0141-9331(86)90271-1. (https://www.sciencedirect.com/science/article/pii/0141933186902711) Abstract: Teaching computer-aided engineering (CAE) to students has in the past been based almost exclusively on mainframe computers supporting a number of terminals. Some of this work can now be performed on microcomputers, as their relatively simple I/O facilities enable them to be used for data acquisition and interactive machine control applications. The paper describes the development of general-purpose hardware and software designed to adapt the Acorn BBC microcomputer for demonstrating CAE applications in a teaching environment. Student experiences with the system are discussed. Keywords: microsystems; CAE; teaching; BBC microcomputer Mariano G. Fernandez, Sumit Ghosh, Ddbx-LPP: A dynamic software tool for debugging asynchronous distributed algorithms on loosely coupled parallel processors, Journal of Systems and Software, Volume 22, Issue 1, July 1993, Pages 27-43, ISSN 0164-1212, https://doi.org/10.1016/0164-1212(93)90120-M. (https://www.sciencedirect.com/science/article/pii/016412129390120M) Abstract: It is generally accepted in the parallel processing community that powerful yet flexible debuggers are indispensable for the efficient programming of complex distributed synchronous and asynchronous algorithms on loosely coupled parallel processors.
Traditional debugging systems, including POKER, permit a user to start, stop, and single-step a parallel program executing on a parallel processor while observing the successive changes of the traced variables and labels. These debuggers are limited in that the user must specify the list of variables and labels to be traced through the declaration section of each routine. As a result, the user may not alter the contents of this set once program execution has been initiated. More recently, debuggers such as ndb and dbxtool claim dynamic debugging support but are limited by clumsy user interfaces. While ndb works within a single window and requires the user to type commands, dbxtool is a simple collection of uniprocessor debuggers with no explicit coordination. PROVIDE claims to use graphical tools for debugging but is limited to a simplified programming language. Furthermore, both ndb and dbxtool are proprietary; few details, if any, on their software engineering design are available in the literature. This article details the software engineering issues in the design and implementation of an actual distributed dynamic runtime software debugger, Ddbx-LPP, that permits the user to view any global variable, structure, and parameter during program execution at any node of a parallel processor system. The system is exclusively mouse driven for relatively easy debugging. The user may insert breakpoints corresponding to any source code line, either before initiating execution or when program execution is temporarily suspended at a breakpoint. Furthermore, when the program, in the course of execution, experiences a nonrecoverable error, its execution is temporarily suspended and control is transferred to the user in a manner identical to the case of a deliberately inserted breakpoint. Although further execution is prohibited, Ddbx-LPP permits the user to view variables and structures to determine the cause of the error. 
Ddbx-LPP's unique ability may be credited to its significant analysis of the object code and symbol table, generated as a result of compilation under the “-g” option, both before and during the actual execution of the program. In contrast to POKER, which requires a sequential programming environment, Ddbx-LPP may function with a user program written in C for any loosely coupled parallel processor. Ddbx-LPP is superior to user-inserted “printf” statements to print out the values of variables and structures during execution because (1) printing all variables and structures would require an overwhelming number of printf statements, and (2) inserting new printf statements would mean program recompilation. Ddbx-LPP has been implemented on the ARMSTRONG system at Brown University and is equally applicable to any loosely coupled parallel processor system. Wee Wee Sim, Peggy Brouse, Towards an Ontology-based Persona-driven Requirements and Knowledge Engineering, Procedia Computer Science, Volume 36, 2014, Pages 314-321, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2014.09.099. (https://www.sciencedirect.com/science/article/pii/S1877050914013489) Abstract: This paper presents the development of ontologies for OntoPersonaURM, an Ontology-Based Persona-Driven User Requirements Modeling model, with the goal of providing insights into the construction of ontologies for explicit specifications of the concept of persona in representing users’ knowledge and characteristics, and the concepts of viewpoints, goals, scenarios, tasks, and requirements for the Web application domain. OntoPersonaURM is composed of three interrelated ontologies: Persona Ontology, Behavioral-GST (Behavioral-Goal-Scenario-Task) Ontology, and Requirements Ontology.
The objectives are to examine 1) how the concept of persona, in the context of the concepts of viewpoint, goal, scenario, task, and requirement, may be integrated in a unified environment and 2) how the concepts and their relationships may be specified ontologically. The explicit specifications of concepts and their relationships in the developed ontologies serve to establish a knowledge repository and foster common understanding of users’ needs and behaviors among developers and stakeholders during the requirements analysis and modeling activity. We provide a running example of a university course registration web application domain to demonstrate the OntoPersonaURM model, consisting of UML class diagrams and explicit specifications of the concepts of the ontologies in the Protégé-Frames ontology knowledge management environment. Keywords: Ontology; Persona; User Profile; User Modeling; Requirements Engineering; Knowledge Engineering; Concept Development Process Darrell G. Linton, Maria A. Cianci, Computer-aided software engineering and Ada -- The technological marriage of the decade, Computers & Industrial Engineering, Volume 17, Issues 1–4, 1989, Pages 542-545, ISSN 0360-8352, https://doi.org/10.1016/0360-8352(89)90120-4. (https://www.sciencedirect.com/science/article/pii/0360835289901204) Abstract: Computer-Aided Software Engineering (CASE) tools and Ada language compilers are now available for both mainframes and Personal Computers (PCs). Although CASE methodologies have existed since the early 1970s and the use of Ada has been required by the Department of Defense since 1985, only recently have CASE and Ada become of serious interest to engineers. This paper identifies the capabilities of PC-based CASE software, the reasons for combining CASE with Ada, and the impact of CASE and Ada on research and teaching in the areas of Industrial and Computer Engineering. 
Based on the authors' first-hand experience, the advantages and/or disadvantages of several CASE tools and Ada environments will be discussed. Other topics addressed include the meanings of related terminology (e.g., object-oriented programming, design methodologies) and the future ramifications of CASE and Ada on the software engineering community. Stephen de Vries, Software Testing for security, Network Security, Volume 2007, Issue 3, March 2007, Pages 11-15, ISSN 1353-4858, https://doi.org/10.1016/S1353-4858(07)70027-2. (https://www.sciencedirect.com/science/article/pii/S1353485807700272) Abstract: Software testing in the form of unit, integration and acceptance tests covers key phases of many development methodologies and is particularly favoured by agile development proponents. These tests serve to prevent regressions, assist refactoring and of course prove that the software meets the functional requirements. Software tests provide a natural, but often overlooked, opportunity for integrating security early on in the software development lifecycle (SDLC), where identified vulnerabilities can be corrected while development is still ongoing. Tanja Bipp, Andreas Lepper, Doris Schmedding, Pair programming in software development teams – An empirical study of its benefits, Information and Software Technology, Volume 50, Issue 3, February 2008, Pages 231-240, ISSN 0950-5849, https://doi.org/10.1016/j.infsof.2007.05.006. (https://www.sciencedirect.com/science/article/pii/S0950584907000596) Abstract: We present the results of an extensive and substantial case study on pair programming, which was carried out in courses for software development at the University of Dortmund, Germany. Thirteen software development teams with about 100 students took part in the experiments. The groups were divided into two sets with different working conditions. In one set, the group members worked on their projects in pairs.
Even though the paired teams could use only half as many workstations as the teams of individual workers, they produced nearly as much code in the same time. In addition, the code produced by the paired teams was easier to read and to understand, which facilitates finding errors and maintenance. Keywords: Pair programming; Empirical software engineering; Quality of software Hossein Saiedian, Bruce W. Weide, The new context for software engineering education and training, Journal of Systems and Software, Volume 74, Issue 2, 15 January 2005, Pages 109-111, ISSN 0164-1212, https://doi.org/10.1016/j.jss.2003.09.023. (https://www.sciencedirect.com/science/article/pii/S0164121203002905) D Andrews, Software engineering education in the 21st century, Information and Software Technology, Volume 41, Issue 14, 5 November 1999, Pages 933-936, ISSN 0950-5849, https://doi.org/10.1016/S0950-5849(99)00067-1. (https://www.sciencedirect.com/science/article/pii/S0950584999000671) Ulrich Flemming, Halil Erhan, Ipek Özkaya, Object-oriented application development in CAD: a graduate course, Automation in Construction, Volume 13, Issue 2, March 2004, Pages 147-158, ISSN 0926-5805, https://doi.org/10.1016/j.autcon.2003.09.009. (https://www.sciencedirect.com/science/article/pii/S0926580503001018) Abstract: The programming languages typically offered by CAD systems for third-party application developers were procedural or functional. A major shift is currently occurring in that new versions of commercial CAD software will support object-oriented programming languages for application development. Developers who wish to take advantage of this new kind of environment must undergo a considerable cognitive “retooling” and adopt new software engineering strategies.
We describe a graduate course that aims at introducing students to effective object-oriented development strategies, especially use case-driven development and the tools provided by the Unified Modeling Language (UML). Students gained experience with these tools by forming, together with the instructors, a single development team writing an application on top of MicroStation/J using JMDL as the programming language. The paper describes the instructors' experience with this approach. Keywords: Object-oriented application; CAD; Unified Modeling Language Anita Herrmann, Thomas Schöning, Standard Telemetry Processing – an object oriented approach using Software Design Patterns, Aerospace Science and Technology, Volume 4, Issue 4, June 2000, Pages 289-297, ISSN 1270-9638, https://doi.org/10.1016/S1270-9638(00)00138-3. (https://www.sciencedirect.com/science/article/pii/S1270963800001383) Abstract: While the safety and reliability requirements of space-borne satellite missions are rapidly increasing, reducing the development costs of processing software is of global interest. Both the quality of the program design and of its implementation have a remarkable influence on achieving the mission goals. Because remote sensing data processing software is highly complex and costly to develop, reusing software yields significant savings and creates maximum benefit in meeting quality needs. For the small satellite mission CHAMP, the Packet Telemetry Recommendation of the European Space Agency ESA gave the main idea to develop an object-oriented program design of an on-ground data processing system. Due to several on-board satellite instruments (sensors, optical cameras etc.) the amount and the structure of the remote sensing data is completely different. This, of course, requires satellite-specific converting algorithms, and with the above-mentioned recommendations only standard processing steps can be performed.
However, when considering future satellite projects, the software design to use must be tailored to the mission-specific requirements of the application data handling only. This paper reports the application of object-oriented software design within the CHAMP project. Based on the Telemetry Packet Standards, the hierarchy of abstract classes is a combined application of the design patterns 'abstract factory' and 'facade'. The growth of flexibility as well as the limitations of their use are discussed with implementation examples. The reuse of both the software architecture and the abstract base classes is also planned for the small satellite mission BIRD. Keywords: telemetry data processing; object-oriented programming; software design patterns; abstract factory pattern; facade pattern Hossein Saiedian, Software engineering education and training for the next millennium, Journal of Systems and Software, Volume 49, Issues 2–3, 30 December 1999, Pages 113-115, ISSN 0164-1212, https://doi.org/10.1016/S0164-1212(99)00082-5. (https://www.sciencedirect.com/science/article/pii/S0164121299000825) Daniel T Joyce, Experience with a fourth-generation language system for software development: An educator's perspective, Journal of Systems and Software, Volume 25, Issue 2, May 1994, Pages 193-200, ISSN 0164-1212, https://doi.org/10.1016/0164-1212(94)90006-X. (https://www.sciencedirect.com/science/article/pii/016412129490006X) Abstract: The author taught software engineering for many years before practicing it in a “real world” project. The project used Paradox, a fourth-generation language system for PCs. This article describes the project and why it is considered “software development,” although it used a packaged data base system.
Included are insights derived from the experience, emphasizing those things that were surprising, the things for which the author was not prepared by just studying and teaching the classic software engineering papers and textbooks. B.J. Thomasson, M.B. Ratcliffe, L.A. Thomas, Improving the tutoring of software design using case-based reasoning, Advanced Engineering Informatics, Volume 20, Issue 4, October 2006, Pages 351-362, ISSN 1474-0346, https://doi.org/10.1016/j.aei.2006.07.002. (https://www.sciencedirect.com/science/article/pii/S1474034606000462) Abstract: Judging by results, the methods undertaken to teach software development to large classes of students are flawed; too many students are failing to grasp any real understanding of programming and software design. To address this problem the University of Wales, Aberystwyth has developed VorteX, an interactive collaborative design tool that captures the design processes of novice students, provides a diagnosis system capable of interpreting the students’ work, and advises on their design process. This paper provides an overview of VorteX, its capabilities and use, and explains how the case-based system identifies redundancies in the storage of student designs and reduces data volume. The paper describes how equivalence maps merge similar classes to reduce the design structure possibilities, how snippets eliminate the replication of components and how abstract snippets represent the design intent of students in a minimalist form. Finally it concludes with comments on the student experience of the VorteX case-based reasoning assistant. Keywords: Software design; Case-based reasoning; Design capture; Design replays; Intelligent tutor Heng-Li Yang, Adoption and implementation of CASE tools in Taiwan, Information & Management, Volume 35, Issue 2, 8 February 1999, Pages 89-112, ISSN 0378-7206, https://doi.org/10.1016/S0378-7206(98)00081-0.
(https://www.sciencedirect.com/science/article/pii/S0378720698000810) Abstract: The aim of the research discussed here was to understand computer-aided software engineering (CASE) tool usage in Taiwan. Through a literature review, we developed two questionnaires – one for general respondents, the other for teachers and CASE agents. After pre-testing, 786 questionnaires were mailed out and 226 effective responses were obtained after two follow-up letters. Factor analyses were used to condense factors from 'severity of critical problems in system development', 'severity of perceived problems in CASE usage', 'attitude toward CASE' and 'CASE implementation success determinants'. Several external variables were considered in exploring their possible influence, as well as the attitude and organizational features of the organizations that successfully used CASE. Path analyses were used to test an attitude model of CASE adoption and implementation success determinants. The results show that 'the perceived problems in CASE tools' had no statistically significant influence on 'attitude toward CASE' and very little influence on 'perceived CASE improvement for system development critical problems'. In addition, we found that 'methodology use' (including the usage before CASE adoption and consistency with the methodology supported by CASE) was the only statistically significant CASE implementation success determinant. Using only a 'methodology use' variable could provide a way to discriminate successful adopters from relatively unsuccessful adopters with a 75% correct classification rate. Keywords: Computer-aided software engineering (CASE); CASE adoption; CASE success; CASE implementation Liesbeth Dusink, Larry Latour, Controlling functional fixedness: the essence of successful reuse, Knowledge-Based Systems, Volume 9, Issue 2, April 1996, Pages 137-143, ISSN 0950-7051, https://doi.org/10.1016/0950-7051(95)01025-4.
(https://www.sciencedirect.com/science/article/pii/0950705195010254) Abstract: A common rule of thumb in making components reusable is to ‘make components generic’. Unfortunately, this is not always an easy task, due in large part to the tendency of software engineers to design implementations in ways that are functionally fixed. That is, software engineers tend to design (a) in a way that solves the particular problem at hand but is not easily generalizable, and (b) in such a way that opportunities to reuse existing components are not easily ‘seen’. The paper suggests how to manually uncommit design decisions from functionally fixed designs in such a way that their essence is kept. This uncommit process is done with the support of (a) reasoning by analogy, (b) statement of implications, (c) challenging of assumptions, and (of course) (d) previous experience/knowledge about both the problem domain and about general software engineering principles. It is shown how this method of working not only helps in designing a solution for a problem class instead of for a specific problem, but also how it helps in (a) seeing how a component can be applied, and (b) building (more reusable) components rather than (more) reusable components. The goal is thus to control functional fixedness in the design process both for and with reuse. Keywords: Reusability; Functional fixedness Neil C. Ramiller, Constructing safety: System designs, system effects, and the play of heterogeneous interests in a behavioral health care setting, International Journal of Medical Informatics, Volume 76, Supplement 1, June 2007, Pages S196-S204, ISSN 1386-5056, https://doi.org/10.1016/j.ijmedinf.2006.05.025. 
(https://www.sciencedirect.com/science/article/pii/S1386505606001468) Abstract: Objective This paper considers the utility of actor-network theory as a basis for uncovering the mutual interdependencies between system design and system impact in an evolving project, and for exploring the implications that these interdependencies hold for the production of safety in behavioral health care. Methods Drawing on a field study of a systems project in a human-services firm, the paper applies key concepts from actor-network theory in the analysis of a design crisis that emerged during the course of the project. Results Actor-network theory provides a compelling framework in this situation for identifying the diverse interests involved, revealing their complex interactions, and illuminating the importance of the emerging system as an organizational actor in its own right. Conclusion Actor-network theory shows promise for use in other analyses concerned with the role of information technology in the construction of safety in health care settings. Keywords: Information systems; Software design; Safety; Behavioral medicine H. Conrad Cunningham, Yi Liu, Cuihua Zhang, Using classic problems to teach Java framework design, Science of Computer Programming, Volume 59, Issues 1–2, January 2006, Pages 147-169, ISSN 0167-6423, https://doi.org/10.1016/j.scico.2005.07.009. (https://www.sciencedirect.com/science/article/pii/S0167642305000900) Abstract: All programmers should understand the concept of software families and know the techniques for constructing them. This paper suggests that classic problems, such as well-known algorithms and data structures, are good sources for examples to use in a study of software family design. The paper describes two case studies that can be used to introduce students in a Java software design course to the construction of software families using software frameworks. 
The first is the family of programs that use the well-known divide and conquer algorithmic strategy. The second is the family of programs that carry out traversals of binary trees. Keywords: Software family; Software framework; Hot spot; Design pattern; Divide and conquer; Tree traversal G Conte, D DelCorso, M Giordana, F Gregoretti, V Pozzolo, MIL project: a microcomputer integrated laboratory, Microprocessors and Microsystems, Volume 4, Issue 2, March 1980, Pages 49-52, ISSN 0141-9331, https://doi.org/10.1016/0141-9331(80)90363-4. (https://www.sciencedirect.com/science/article/pii/0141933180903634) Abstract: The introduction of microprocessors has meant a tremendous change in electronic technology and it is a common feeling that, with the continuous improvement of LSI and VLSI technology, the future will evolve at least at the same rate. This revolution produced great changes not only in electronic products themselves, but deeply affected research laboratories, production plants and maintenance services of all industries manufacturing electronic equipment or incorporating electronics in their final products. The main effect was a great demand for people with knowledge coming from different and previously separated fields: computer science, software engineering and digital electronic design. The first answer to the need of a quick updating for designers came from the market itself and everyone knows how many one-day courses were offered on ‘everything you must know about microprocessors’. A more long-term answer is the responsibility of the universities and of the technical schools that must prepare the new people for the new technological wave. Yael Dubinsky, Orit Hazzan, Using a role scheme to derive software project metrics, Journal of Systems Architecture, Volume 52, Issue 11, November 2006, Pages 693-699, ISSN 1383-7621, https://doi.org/10.1016/j.sysarc.2006.06.013. 
(https://www.sciencedirect.com/science/article/pii/S1383762106000713) Abstract: Role playing is common in our lives. We play different roles with our family, at work, as well as in other environments. Role allocation in software development projects is also accepted, though it may be implemented differently by different software development methods. In a previous work [Y. Dubinsky, O. Hazzan, Roles in agile software development teams, in: 5th International Conference on Extreme Programming and Agile Processes in Software Engineering, 2004, pp. 157–165] we found that personal roles may raise teammates’ personal accountability while maintaining the essence of the software development method. In this paper we present our role scheme, elaborate on its implementation and explain how it can be used to derive metrics. We illustrate our ideas with data gathered from student projects at the university. Keywords: Role scheme; Project metrics; Project-based course Francesco Pinciroli, Carlo Combi, Giuseppe Pozzi, A database schema for public-domain medical software, Computers in Biology and Medicine, Volume 24, Issue 4, July 1994, Pages 243-254, ISSN 0010-4825, https://doi.org/10.1016/0010-4825(94)90021-3. (https://www.sciencedirect.com/science/article/pii/0010482594900213) Abstract: The quantity of public-domain medical software available is huge, and a classification schema may therefore be helpful. We developed a schema that includes identification data (name of the software, author, etc.), description (hardware and software requirements), classification (software category, application domain, etc.) and evaluation data (external quality and internal quality factors). The schema was tested on the public-domain software available at the SCAMC® meetings (about 36 Mb). We also classified the software by employing students from a master course in computer science and medical informatics.
We stored the large quantity of information collected in a database we developed using Paradox™. Keywords: Public-domain software; Classification schema; Quality evaluation; Software engineering; SCAMC® meetings Joon-Sang Lee, Doo-Hwan Bae, An aspect-oriented framework for developing component-based software with the collaboration-based architectural style, Information and Software Technology, Volume 46, Issue 2, 1 February 2004, Pages 81-97, ISSN 0950-5849, https://doi.org/10.1016/S0950-5849(03)00111-3. (https://www.sciencedirect.com/science/article/pii/S0950584903001113) Abstract: The component-based development (CBD) technique has emerged to fulfill the demand for reuse of existing artifacts. In comparison to traditional object-oriented techniques, CBD can provide more advanced abstraction concepts such as subsystem-level reusability, gross structure abstraction, and global control flow abstraction. Unfortunately, existing software development techniques are not mature enough to allow components developed by third parties to be used in a highly flexible way. It is notable that there are certain kinds of software requirements, such as non-functional requirements, that must be implemented cutting across multiple classes, largely losing modularity in object-oriented design and implementation code. Therefore, it is not easy to reuse components without considering their low-level implementation details. In this article, we propose the Aspect-Oriented Development Framework (AODF), in which functional behaviors are encapsulated in each component and connector, and particular non-functional requirements are flexibly tuned separately in the course of software composition. To support the modularity for non-functional requirements in component-based software systems, we devise Aspectual Composition Rules (ACR) and Aspectual Collaborative Composition Rule (ACCR).
AODF enables component-based software to support both the modularity and the manageability of non-functional requirements such as synchronization, performance, physical distribution, fault tolerance, and atomic transactions. With the Collaboration-Based architectural style, AODF explicitly supports dealing with non-functional requirements at the intra-component and inter-component levels. Keywords: Component-based development; Software architecture; Aspect-oriented programming; Collaboration-based design; Non-functional requirements Y. Papadopoulos, J. McDermid, R. Sasse, G. Heiner, Analysis and synthesis of the behaviour of complex programmable electronic systems in conditions of failure, Reliability Engineering & System Safety, Volume 71, Issue 3, March 2001, Pages 229-247, ISSN 0951-8320, https://doi.org/10.1016/S0951-8320(00)00076-4. (https://www.sciencedirect.com/science/article/pii/S0951832000000764) Abstract: This paper introduces a new method for safety analysis which modifies, automates and integrates a number of classical safety analysis techniques to address some of the problems currently encountered in complex safety assessments. The method enables the analysis of a complex programmable electronic system from the functional level through to low levels of its hardware and software implementation. In the course of the assessment, the method integrates design and safety analysis and harmonises hardware safety analysis with the hazard analysis of software architectures. It also introduces an algorithm for the synthesis of fault trees, which mechanises and simplifies a large and traditionally problematic part of the assessment, the development of fault trees. In this paper, we present the method and discuss its application to a prototypical distributed brake-by-wire system for cars.
We argue that the method can help us rationalise and simplify an inherently creative and difficult task and therefore gain a consistent and meaningful picture of how a complex programmable system behaves in conditions of failure. Keywords: Automated safety analysis; Mechanical fault tree synthesis; Software hazard analysis; Safety cases Jeffrey J. McConnell, Computer graphics education: Issues from multiple perspectives, Computers & Graphics, Volume 19, Issue 2, March–April 1995, Pages 331-334, ISSN 0097-8493, https://doi.org/10.1016/0097-8493(94)00161-Q. (https://www.sciencedirect.com/science/article/pii/009784939400161Q) Abstract: Much of computer graphics today neglects the fact that there is a receiver of the information being presented. Therefore, teaching computer graphics must not be done with just a single focus. Rather, concepts related to hardware, software, design, color, and aesthetic issues must also be addressed. Further, students should be made aware of cultural and gender differences related to color, design, and information presentation. These issues must be addressed for the presentation not to get in the way of the message being presented. Charles Ume, Marc Timmerman, Integrated hardware and software designs in mechatronics laboratory courses, Mechatronics, Volume 4, Issue 5, August 1994, Pages 539-549, ISSN 0957-4158, https://doi.org/10.1016/0957-4158(94)90015-9. (https://www.sciencedirect.com/science/article/pii/0957415894900159) Abstract: Concurrent design is a concept of engineering design where all aspects of a design problem are considered simultaneously. Mechatronics is a specialization of concurrent design to problems dealing with the integration of mechanical and electrical components into specialized modular devices. This paper illustrates this concept of design through student projects drawn from the files of Mechanical Engineering microprocessor design courses at the Georgia Institute of Technology. 
The design projects show the integration of mechanical devices, controls concepts, electronic hardware, and microprocessor software using the concurrent design philosophy. Bjarne Belhage, Improvement of spatial understanding in art, Education and Computing, Volume 5, Issues 1–2, 1989, Pages 29-34, ISSN 0167-9287, https://doi.org/10.1016/S0167-9287(89)80007-X. (https://www.sciencedirect.com/science/article/pii/S016792878980007X) Abstract: In art, there are problems with the representation of space in the plane of a picture. Solutions to these problems from ancient cultures are described. Connections between Perception Psychology and geometrical drawings made by children are discussed, as are the development of spatial elements in children's drawings, different representations of linear perspective, isometry, ‘Dürer's Way’, central and X-perspective. The paper goes on to describe an educational software design course in Scandinavia. The design process is presented by an account of the methodological and pedagogical considerations that have led to the implementation of the spatial drawing program ‘RUM’. Keywords: Market place design; Perspective in art; ‘RUM’; Software design method; Software tool; Three-dimensional drawing John W. Brackett, The Boston University software engineering graduate program: Continuing education through interactive television, Journal of Systems and Software, Volume 10, Issue 4, November 1989, Pages 267-269, ISSN 0164-1212, https://doi.org/10.1016/0164-1212(89)90073-3. (https://www.sciencedirect.com/science/article/pii/0164121289900733) Claire H Jarvis, GEO_BUG: a geographical modelling environment for assessing the likelihood of pest development, Environmental Modelling & Software, Volume 16, Issue 8, December 2001, Pages 753-765, ISSN 1364-8152, https://doi.org/10.1016/S1364-8152(01)00040-8.
(https://www.sciencedirect.com/science/article/pii/S1364815201000408) Abstract: This paper describes software designed to explore pest phenology (development) over space and time. The framework presented links sequences of interpolated daily maximum and minimum temperatures with a variety of process-based phenology and accumulated temperature models. The flexibility offered by this approach is demonstrated using examples of gridded maps of pest phenology on target dates, graphs of the sequences of pest development at individual locations and assessments of error in the predicted dates over the course of a model run. Finally, the potential application of the software in support of agricultural management systems, policy development and integrated research is discussed. Keywords: Interpolation; Daily temperatures; Geographical information systems (GIS); Pest phenology John R. Crookal, Education for CIM, CIRP Annals, Volume 36, Issue 2, 1987, Pages 479-494, ISSN 0007-8506, https://doi.org/10.1016/S0007-8506(07)60750-1. (https://www.sciencedirect.com/science/article/pii/S0007850607607501) Abstract: Computer Integrated Manufacturing (CIM) is probably the most challenging concept that has faced manufacturing. It is multi-disciplinary by nature. It requires the latest and most advanced computer technology, software engineering skills, and networking. It involves new manufacturing control methodologies, Artificial Intelligence, as well as essential data and communications compatibility questions. In principle as well as in practice, CIM must reach into every relevant aspect of a manufacturing business. It is equally concerned with design and product innovation as with manufacturing control, business and financial strategy, and supply and distribution management. James R. Comer, David J.
Rodjak, Software engineering education at Texas Christian University: Adapting a curriculum to changing needs, Journal of Systems and Software, Volume 10, Issue 4, November 1989, Pages 235-244, ISSN 0164-1212, https://doi.org/10.1016/0164-1212(89)90069-1. (https://www.sciencedirect.com/science/article/pii/0164121289900691) Abstract: In response to the need for skilled software engineers, Texas Christian University, in the Fall of 1978, established a Master's Degree in Software Engineering, the first degree program of its kind in the country. Because of external pressure, prompted by the absence of an engineering college at TCU, the program was renamed Master of Software Design and Development (MSDD) in 1980. After three years of experience with the MSDD program, the curriculum was revised in 1981 to reflect the changing needs of the software engineering profession. This revised curriculum, currently in place at Texas Christian University, is described and evaluated. Avenues of future curriculum expansion are explored. Sarah Gordon, A short course in antivirus software testing: seven simple rules for evaluating tests, Network Security, Volume 2004, Issue 6, June 2004, Pages 17-18, ISSN 1353-4858, https://doi.org/10.1016/S1353-4858(04)00094-7. (https://www.sciencedirect.com/science/article/pii/S1353485804000947) Abstract: Antivirus software tests are important when selecting antivirus software. However, there are many different tests, and interpreting the results can be challenging. Additionally, the needs of the corporate customer and home user differ, and it is important to understand these differences in order to evaluate antivirus software tests critically.