 Title & Authors
Describing Activities to Verify Artifacts (Documents and Program) in Software R&D
Dashbalbar, Amarmend; Lee, Eun-Chul; Lee, Jung-Won; Lee, Byeongjeong
 Abstract
In software R&D, artifacts including documents and program code are produced. There are two kinds of software R&D artifacts: research artifacts and development artifacts. Research artifacts include the software R&D planning document, annual report, final report, research notes, and so on. Software development artifacts include software requirements, software design descriptions, test plans, test reports, and program code. It is important to verify the documents and to test the code in order to check the direction of the R&D. Moreover, we should check relationships, such as completeness and consistency, between research and development artifacts. Verification and testing help the project manager and researchers understand what they do during software projects. Therefore, in this study, we present a process to verify documents and programs in software R&D. In the process, we check the documents produced in software R&D and test the program code. We describe the process using Essence elements, including alphas, activities, and competencies. We present a case study to show the effectiveness of the process.
 Keywords
Software R&D; Validation Activities; Essence; Artifact Testing
 Language
English
 References
1.
I. Jacobson, S. Huang, M. Kajko-Mattsson, P. McMahon, and E. Seymour, "Semat - three year vision," Programming and Computer Software, vol. 38, no. 1, pp. 1-12, 2012. http://dx.doi.org/10.15514/syrcose-2011-5-inv

2.
I. Jacobson, P. Ng, P. McMahon, I. Spence, and S. Lidman, "The essence of software engineering: the SEMAT kernel," ACM Queue, vol. 10, no. 10, 2012. http://dl.acm.org/citation.cfm?id=2389616

3.
T. Sedano and C. Péraire, "State-based Monitoring and Goal-driven Project Steering: Field Study of the SEMAT Essence Framework," in Proc. of the 36th International Conference on Software Engineering, pp. 325-334, 2014. http://dx.doi.org/10.1145/2591062.2591155

4.
I. Burnstein, A. Homyen, R. Grom, and C. R. Carlson, "A model to assess testing process maturity," CrossTalk: The Journal of Defense Software Engineering, vol. 11, no. 11, pp. 6-30, 1998. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.434.1067&rep=rep1&type=pdf

5.
T. Ericson, A. Subotic, and S. Ursing, "TIM - A Test Improvement Model," Software Testing, Verification and Reliability, vol. 7, no. 4, pp. 229-246, 1997. http://dx.doi.org/10.1002/(sici)1099-1689(199712)7:4<229::aid-stvr149>3.3.co;2-d

6.
I. Burnstein, T. Suwannasart, and C. R. Carlson, "Developing a testing maturity model for software test process evaluation and improvement," in Proc. of the International Test Conference, pp. 581-589, 1996. http://dx.doi.org/10.1109/test.1996.557106

7.
E. van Veenendaal and J. J. Cannegieter, "Test Maturity Model integration (TMMi): Results of the first TMMi benchmark - where are we today?," p. 3, EuroSTAR Software Testing Community, 2013.

8.
E. van Veenendaal, R. Grooff, and R. Hendriks, "Test Process Improvement using TMMi," Testing Experience: The Magazine for Professional Testers, vol. 3, no. 19, pp. 21-25, 2008. http://www.erikvanveenendaal.nl/NL/files/Test%20Process%20Improvement%20using%20TMM(i).pdf

9.
P. Ng and S. Huang, "Essence: A framework to help bridge the gap between software engineering education and industry needs," in Proc. of the IEEE 26th Conference on Software Engineering Education and Training (CSEE&T), pp. 304-308, 2013. http://dx.doi.org/10.1109/cseet.2013.6595266

10.
B. Elvesæter, G. Benguria, and S. Ilieva, "A comparison of the Essence 1.0 and SPEM 2.0 specifications for software engineering methods," in Proc. of the Third Workshop on Process-Based Approaches for Model-Driven Engineering, no. 2, p. 2, 2013. http://dx.doi.org/10.1145/2489833.2489835

11.
D. J. Han and H. S. Han, "Guidelines for Implementing Configuration Management in Extreme Programming based on CMMI," Journal of Internet Computing and Services, vol. 9, no. 2, pp. 107-118, 2008.

12.
K. S. Lee and T. G. Lee, "A Software Development Process of Core Instrumentation System Based on the Rational Unified Process," Journal of Internet Computing and Services, vol. 5, no. 4, pp. 95-113, 2004.

13.
S. W. Shin, H. K. Kim and S. W. Kim, "Framework for Improving Mobile Embedded Software Process," Journal of Internet Computing and Services, vol. 10, no. 5, pp. 195-209, 2009.

14.
J. Cangussu, R. DeCarlo, and A. Mathur, "Using sensitivity analysis to validate a state variable model of the software test process," IEEE Transactions on Software Engineering, vol. 29, no. 5, pp. 430-443, 2003. http://dx.doi.org/10.1109/tse.2003.1199072

15.
K. H. Jin, S. M. Song, J. W. Lee, and B. J. Lee, "Test Planning and Reporting for Constant Monitoring of Software R&D Projects," Korea Computer Congress, vol. 42, no. 1, pp. 597-599, 2015.

16.
S. Imoto, Y. Yabuuchi, and J. Watada, "Fuzzy regression model of R&D project evaluation," Applied Soft Computing, vol. 8, no. 3, pp. 1266-1273, 2008. http://dx.doi.org/10.1016/j.asoc.2007.02.024

17.
J. A. Kim and J. H. Kim, "Quality Assessment Framework for Medical Device Specific SW R&D Project," International Journal of Software Engineering and Its Applications, vol. 8, no. 1, pp. 371-376, 2014. http://dx.doi.org/10.14257/ijseia.2014.8.1.32