Now that my group and I have finished our Hale Aloha WattDepot implementation, I'm curious to see how other groups tackled the same tasks, and what the quality of their products is. So I decided to take a look at the "TNT" group's Hale Aloha WattDepot implementation. In assessing the group's product, I believe it is fitting to review it under the Three Prime Directives of Software Engineering. By doing so, I can gain a broad understanding of the many components of the product, and also sharpen my knowledge of what the Three Prime Directives are and what I need to keep in mind whenever I develop a piece of software, to make sure that the product is useful, easy to install, and easy to understand and enhance.
To get started with the First Prime Directive, "Does the system accomplish a useful task?", I took a look at the overall functionality of the product to see if all of the components were there. The group was able to retrieve the current power of a tower or lounge, get the energy used by a tower or lounge on a specific date, get the energy usage of a tower between two given dates, and finally rank the towers by energy usage from least to most between two dates. These are all functions that we implemented in our own system as well, and they are useful tasks indeed.
The Second Prime Directive is: "Can an external user successfully install and use the system?"
Overall, the system does very well and is very useful. It did not crash when given bad inputs, and it provided correct and useful output when given valid commands.
Taking a look at the homepage, I found very precise information about what the system does, although to find sample input and output I had to go to the User Guide wiki page. That page contains sample input and output for the system, along with simple instructions for running it. On the download page, the group offers an executable jar file for those simply wishing to run the application. In addition, they offer the executable jar file together with the source files, so that external users do not need to compile the system before executing it. The distribution file name contains a version number and a timestamp, so external users can track changes and see how long ago they took place.
In order to test how robust the system was, I input various commands, both valid and invalid, to see how the product handles them. Here are some of the commands I input, with a description of what was output:
- “help”: gives the proper output.
- “current-power Lehua”: gives the proper output of Lehua’s current power.
- “current-power lehua”: since the tower name is case sensitive, gives a bad syntax error.
- “Current-power Lehua”: since the command checker is also case sensitive, gives a bad command error.
- “daily-energy Lehua 2011-11-25”: gives the proper output of Lehua’s daily energy for Nov. 25, 2011.
- “daily-energy Lehua 2012-11-30”: since the date is in the future, gives a bad date error.
- “daily-energy Lehua 11-25-11”: since the date isn't properly formatted, gives a bad syntax error.
- “daily-energy Lehua 11-25-2011”: also gives a bad syntax error, because only one date format is accepted.
- “daily-energy Lehua 2011-10-20”: gives a date error stating that the date must be after November 22.
- “energy-since Lehua 2011-11-23”: gives the proper output of the amount of energy used in Lehua since November 23.
- “energy-since Lehua 2011-12-25”: since the date is in the future, gives a date error.
- “rank-towers 2011-11-23 2011-11-25”: gives the proper output ranking the towers from least to most energy used.
- “rank-towers 2011-11-25 2011-11-23”: since the system recognizes that the start date is after the end date, gives both a date error and a bad syntax error.
- “quit”: properly closes the application.
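The validation behavior observed above can be sketched in Java. This is a minimal illustration, not TNT's actual code; the class and method names, the tower list, and the cutoff date are my own assumptions based on the observed outputs (case-sensitive tower matching, a single accepted ISO date format, a rejection of future dates, and a rejection of dates on or before November 22, 2011):

```java
import java.time.LocalDate;
import java.time.format.DateTimeParseException;
import java.util.List;

public class ArgumentChecker {
  // Hypothetical tower list; matching is case-sensitive, which explains
  // why "lehua" produces a bad syntax error while "Lehua" succeeds.
  private static final List<String> TOWERS =
      List.of("Mokihana", "Ilima", "Lehua", "Lokelani");
  // Data appears to be available only after this date.
  private static final LocalDate EARLIEST = LocalDate.of(2011, 11, 22);

  public static boolean isValidTower(String tower) {
    return TOWERS.contains(tower);  // case-sensitive equality
  }

  // Returns "OK" or a descriptive error, mirroring the observed messages.
  public static String checkDate(String text, LocalDate today) {
    try {
      // LocalDate.parse accepts only the ISO yyyy-MM-dd format, so
      // "11-25-2011" and "11-25-11" both fail here.
      LocalDate date = LocalDate.parse(text);
      if (date.isAfter(today)) {
        return "Error: date is in the future.";
      }
      if (!date.isAfter(EARLIEST)) {
        return "Error: date must be after " + EARLIEST + ".";
      }
      return "OK";
    }
    catch (DateTimeParseException e) {
      return "Error: bad date syntax, expected yyyy-MM-dd.";
    }
  }

  public static void main(String[] args) {
    LocalDate today = LocalDate.of(2011, 12, 1);
    System.out.println(isValidTower("Lehua"));   // true
    System.out.println(isValidTower("lehua"));   // false
    System.out.println(checkDate("2011-11-25", today));
    System.out.println(checkDate("11-25-2011", today));
    System.out.println(checkDate("2012-11-30", today));
  }
}
```

Centralizing checks like these in one place is one plausible way a command-line tool ends up with the consistent error messages seen in the test run above.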
Finally, I reviewed the Third Prime Directive: "Can an external developer successfully understand and enhance the system?" For this, I took a look at the Developer's Guide wiki page to see if it provides sufficient instructions on building the system from its sources. TNT's Developer's Guide does indeed provide clear instructions on how to build the system once the sources are downloaded. Quality assurance is automated by running "ant -f verify.build.xml", which runs Checkstyle, PMD, and FindBugs. The wiki instructs the user to run this command before making any modifications to the code, to confirm that the original code doesn't have any errors. In terms of coding standards, the wiki page mentions the Eclipse format and provides a link to the XML document for users to import. The wiki also mentions that the project follows issue-driven project management and continuous integration. Continuous integration is a practice in which smaller pieces of code are verified in small increments instead of all at once at the end of construction. Each integration is often verified automatically (in our case using the "ant" tool), which helps the group find out who broke the code, when, and how if there is a build failure. Issue-driven project management is a practice in which the overall build is broken down into smaller issues, each usually about two days' worth of work. For this project, we submitted our issues in Google Code and used that service to monitor and update them as well. A link is provided to the continuous integration server associated with the application. The only thing the Developer's Guide is missing is how to generate the JavaDoc documentation. Otherwise, the information it contains is very concise and doesn't include any useless information.
In addition to the Three Prime Directives, a number of other components must be reviewed in order to get a complete understanding of the quality of not only the functionality, but also the documentation and the process and progress of the product. The JavaDoc, build system, coding standards, issue-driven project management, and continuous integration must also be reviewed; for TNT, the results of reviewing these components are as follows:
JavaDoc Review
I was able to generate the JavaDoc documentation successfully. After reading through the various class documentations, it is clear that the entire application is tied together by the Processor class. The packages are named appropriately: the Main class is in the top-level directory, the command processor is in a subdirectory of the main directory, and the various commands are in another subdirectory of the main directory.
Build System Review
Using the ant command with build.xml, I was able to build the system without any errors. The author(s) of each piece of code are listed in the source files so that external users know who wrote what. In terms of coverage, after running JaCoCo, the project does not have 100% coverage. Some classes, such as Main, Processor, and InvalidArgumentException, have 0% coverage, mainly because no test cases were written for them. The remaining sub-100% coverage is due to the fact that none of the test cases assert that invalid input is rejected. The current test cases exercise the application fairly well for valid inputs; however, because invalid inputs are untested, there is a chance that an external developer could write an enhancement that throws an unexpected exception and not know why it is being thrown.
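The missing kind of test, asserting that invalid input actually raises an exception, can be sketched as follows. The class and method here are hypothetical stand-ins, not TNT's actual Processor or InvalidArgumentException; the sketch only shows the shape of a negative test:

```java
// A minimal illustration of the negative test the reviewed suite lacked:
// asserting that invalid input raises an exception instead of only
// asserting on valid-input behavior.
public class NegativeTestSketch {
  // Hypothetical stand-in for a command that rejects unknown towers.
  static double currentPower(String tower) {
    if (!tower.equals("Lehua")) {
      throw new IllegalArgumentException("Unknown tower: " + tower);
    }
    return 42.0;  // placeholder reading in kW
  }

  public static void main(String[] args) {
    boolean threw = false;
    try {
      currentPower("lehua");  // lowercase should be rejected
    }
    catch (IllegalArgumentException e) {
      threw = true;
    }
    // The negative assertion: the invalid input must have thrown.
    if (!threw) {
      throw new AssertionError("Expected IllegalArgumentException");
    }
    System.out.println("Negative test passed.");
  }
}
```

With tests like this in place, the 0% coverage on the exception classes would also improve, since the exception paths would finally be executed.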
Coding Standards Review
What I found to be really nice is that each of the command classes is formatted in the same way (because they all implement the same interface). What I found surprising at the same time is that hardly any private methods were used, but this is okay since WattDepot already provides many of the needed methods. The only files I had small problems reading were the Main class and the CommandManager class. In the Main class, the only thing I had trouble with was figuring out when the IOException would be thrown; documentation on what would throw it would have been nice. The CommandManager class is beautifully written using some advanced Java techniques that I had not come across before, so I had some trouble reading it and had to take time to research a few things. This was not a problem caused by the author.
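The pattern of uniform command classes behind a shared interface can be sketched like this. All names here (Command, HelpCommand, CommandManager's methods) are illustrative and are not claimed to match TNT's actual identifiers:

```java
import java.util.HashMap;
import java.util.Map;

// Every command implements the same interface, which is what keeps the
// command classes uniformly structured.
interface Command {
  String getName();
  String execute(String[] args);
}

class HelpCommand implements Command {
  public String getName() { return "help"; }
  public String execute(String[] args) {
    return "Available commands: help, current-power, daily-energy, "
        + "energy-since, rank-towers, quit";
  }
}

class CommandManager {
  private final Map<String, Command> commands = new HashMap<>();

  void register(Command command) {
    commands.put(command.getName(), command);
  }

  // Lookup by exact name is case-sensitive, which would explain why
  // "Current-power" is treated as an unknown command rather than a
  // variant of "current-power".
  String dispatch(String line) {
    String[] tokens = line.trim().split("\\s+");
    Command command = commands.get(tokens[0]);
    if (command == null) {
      return "Error: unknown command \"" + tokens[0] + "\"";
    }
    return command.execute(tokens);
  }
}

public class CommandSketch {
  public static void main(String[] args) {
    CommandManager manager = new CommandManager();
    manager.register(new HelpCommand());
    System.out.println(manager.dispatch("help"));
    System.out.println(manager.dispatch("Help"));  // unknown: case-sensitive
  }
}
```

A design like this also explains why few private methods are needed: each command class delegates the real work to the WattDepot client library and only handles its own argument parsing and output.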
Issues Page Review
The issues page associated with this project makes it very clear who worked on which part of the system. A different person worked on different classes, so if an external developer had any questions about the system, it would be very easy to contact the author(s) of that piece. In this group of three, it seems that two of the developers did more of the work. The third member worked on a single command class, its test case, the Main class, and the Help class. Most of these classes are fairly short, which is reflected in the number of issues posted by that author. However, the overall quality of these classes is great, so it is clear that this member worked hard as well but perhaps did not have the experience level of the other two.
Continuous Integration Server Review
The Jenkins continuous integration server shows the history of failed and successful builds. Other than one stretch at the very beginning of the project, between builds three and four, where it took about a day and a half to fix the project, all failed builds were fixed within an hour, and most were fixed in as little as 10-15 minutes. As for commits associated with issues, the group fell just short of the nine-out-of-ten goal, with about eight out of ten commits tied to an existing issue.
Conclusion
From this extensive review of TNT's Hale Aloha command line interface system, it is safe to say that the Three Prime Directives of Software Engineering were fulfilled. The product can successfully be understood and enhanced by new developers, and everything was well organized and presented. This group definitely had some features and functionality that were more advanced than my own group's, but that is a good thing, because I can learn from this and apply the techniques I picked up to other projects in the future. TNT's product was extremely robust and very impressive, and I see it as a model to remember and refer to when implementing similar features. Although this review was very lengthy and time consuming, I learned a lot, and I feel that the TNT group did an excellent job working together and putting together a solid product.