Tuesday, December 13, 2011

Doing More Cool Things with WattDepot



Alright, since it has probably taken from the last post until now to finish reading straight through that technical review of TNT's Hale Aloha Command Line Interface, I suppose it's time to tell you about the next steps my team and I have taken with this project. Keeping the original code base, we implemented further functionality on the product, to see whether it could be done and with what effort and challenges. Our goal was to add three new commands: set-baseline, monitor-power, and monitor-goal. Monitor-power outputs the current power readings of a tower or lounge every user-defined number of seconds until a character is entered, which terminates the task. Monitor-goal takes a baseline (defined by set-baseline) as well as a user-defined percentage reduction goal, calculates whether the tower or lounge is meeting that goal, and, like monitor-power, outputs the result every so many seconds as defined by the user.
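To give a rough idea of the shape of the monitoring loop, here is a minimal sketch in Java. It is not TNT's or our actual code; the fetchCurrentPower helper is a placeholder for whatever WattDepot client call the real command makes, and the tower name and interval are just example values.

    import java.util.Timer;
    import java.util.TimerTask;

    /** Minimal sketch of a monitor-power style loop (illustrative only). */
    public class MonitorPowerSketch {

      /** Placeholder for the real WattDepot client query the command would make. */
      static double fetchCurrentPower(String tower) {
        return 1234.5; // hypothetical value; the real command queries the WattDepot server
      }

      public static void main(String[] args) throws Exception {
        final String tower = "Lehua";   // tower or lounge supplied by the user
        final int intervalSeconds = 10; // user-defined sampling interval

        Timer timer = new Timer();
        timer.scheduleAtFixedRate(new TimerTask() {
          @Override
          public void run() {
            System.out.println(tower + " current power: " + fetchCurrentPower(tower) + " W");
          }
        }, 0, intervalSeconds * 1000L);

        // Block until the user enters something. Note that System.in only delivers
        // input after a carriage return, which is exactly the limitation we ran into.
        System.in.read();
        timer.cancel();
      }
    }

Monitor-goal follows the same loop structure; it just compares each reading against the stored baseline and the requested percentage reduction instead of printing the raw power.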

Although there were some issues with implementing this, such as getting the timer to work (especially terminating it when any character is pressed), the majority of the issues arose from the fact that the bulk of the code, including the base itself, was built by other developers. Fortunately for us, the code base was built very well, with many advanced and well-implemented techniques; it is much more robust and, in the long run, easier to use than the one we built ourselves. However, because some of the implementations went over our heads, it was difficult to understand and get used to in the beginning. But I don't think this was due to a failure to meet Prime Directive number 3 (an external developer being able to understand and enhance the system), because it was not caused by poor documentation, unreadable code, or anything similar. It just took a little time to understand, which is probably normal when taking over someone else's code. As mentioned in my technical review of TNT's product, the Three Prime Directives of Software Engineering were met: the product definitely accomplishes something useful, the documentation provides clear instructions for installing and using the system, and an external developer can easily (or at least relatively easily) develop and enhance the system.

As far as Issue Driven Project Management went, just as with our own Hale Aloha Command Line Interface, everything was rather smooth sailing and we didn't run into any problems. Again, it was great in the beginning for portioning out the workload, but as we had to make modifications to our code, it became less of a requirement and a little more of a nuisance. This time it wasn't as bad, though, because the command interface and processors had already been created, so we only had to implement the functions (and their associated test classes) themselves.

Most of our enhancements to TNT's Hale Aloha CLI work well, and I'm actually quite surprised at what we were able to accomplish, given the little time we had to spend on it and the complexity of the tasks. One thing that doesn't quite work properly is the timer used by monitor-power and monitor-goal. It took a while just to get the timer implemented, and we could not get the loop to terminate the way we wanted: we would have liked it to end when the user types any character, but instead it only does so upon a carriage return. Another defect is the incomplete check of whether a baseline was set before the monitor-goal command is run; the baseline is currently not tower-specific. If set-baseline is run only for Ilima, for example, the user can still run monitor-goal on Lehua without encountering any errors. This is a minor error and could probably have been fixed rather easily had we had more time. Other than that, everything seems to be working well, and I'm very pleased with the quality of the enhancements we were able to produce in just a couple of weeks. The test cases show that our system is fairly robust (apart from the aforementioned flaw), and it is fully documented and working.
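For the tower-specific baseline problem, one plausible fix is to key the stored baselines by tower name instead of keeping a single shared value, so monitor-goal can refuse to run for a tower that has no baseline. This is only an illustrative sketch under that assumption; the class and method names are mine, not the project's.

    import java.util.HashMap;
    import java.util.Map;

    /** Illustrative sketch of per-tower baselines; names are assumptions, not project code. */
    public class BaselineStore {

      /** Hypothetical baseline record: average energy per hour of day for one tower. */
      public static class Baseline {
        final double[] hourlyWattHours;
        Baseline(double[] hourlyWattHours) {
          this.hourlyWattHours = hourlyWattHours;
        }
      }

      private final Map<String, Baseline> baselines = new HashMap<String, Baseline>();

      /** Called by set-baseline: remember the baseline for this specific tower. */
      public void setBaseline(String tower, Baseline baseline) {
        baselines.put(tower, baseline);
      }

      /** Called by monitor-goal: refuse to run if no baseline exists for this tower. */
      public Baseline getBaselineOrFail(String tower) {
        Baseline baseline = baselines.get(tower);
        if (baseline == null) {
          throw new IllegalStateException("No baseline set for " + tower
              + "; run set-baseline " + tower + " first.");
        }
        return baseline;
      }

      /** True if the current reading meets the requested percentage reduction from the baseline. */
      public static boolean meetsGoal(double baselineWattHours, double currentWattHours, double goalPercent) {
        return currentWattHours <= baselineWattHours * (1.0 - goalPercent / 100.0);
      }
    }

With the lookup keyed by tower, running monitor-goal on Lehua after setting a baseline only for Ilima would fail fast with a clear message instead of silently proceeding.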


Working with someone else's code base (or anything, for that matter) is difficult and can be a headache, especially in the beginning when everything is so foreign, but the process is beneficial in many ways. The obvious one is that it provides you with examples, good and bad, of implementation techniques, so you can learn new tricks where there are any and fix the places that are substandard. Secondly, being put in an environment that requires you not only to work as a team but also to switch products in order to compare or improve them forces you to write better code. Especially when you know that other people's eyes are going to be looking at the code you wrote, you want it to be clean, correct, and the best it can possibly be. This includes documentation and code formatting, which are often left attention-deprived when you know that only one set of eyes will ever look at the code. Plus, there's the pride aspect of producing really good code: perhaps implementing a cool trick you learned that makes a certain function simple, correct, and efficient, so you can show it off in a "look at what I know how to do" kind of way. That makes things a bit more fun, rather than "it's messy, but... ah, it works, so it's good enough for me." Friendly, harmless "competition," if I may even call it that, never hurts anyone, and is one way to intrinsically encourage good code in a pain-free way!

Friday, December 2, 2011

TNT's Explosive Performance

Now that my group and I have finished our Hale Aloha WattDepot implementation, I'm curious to see how other groups tackled the same tasks and what the quality of their products is. So I decided to take a look at the "TNT" group's Hale Aloha WattDepot implementation. In assessing the group's product, I believe it is fitting to review it under the Three Prime Directives of Software Engineering. By doing so, I can gain a broad understanding of many components of the product, and also sharpen my knowledge of what the Three Prime Directives are and what I need to keep in mind whenever developing a piece of software, to make sure that the product is useful, easy to install, and easy to understand and enhance.

Therefore, to get started with the First Prime Directive, "Does the system accomplish a useful task?", I took a look at the overall functionality of the product to see if all of the components were there. The group was able to retrieve the current power of a tower or lounge, get the energy for a tower or lounge from a specific date, get the energy usage for a tower between two given dates, and finally rank the towers' energy usage from least to most between two dates. These are all of the functions that we implemented in our own system as well, and they are useful tasks indeed.

The Second Prime Directive is: "Can an external user successfully install and use the system?"
Taking a look at the homepage, I found very precise information about what the system does, although to find sample input and output I had to go to the User Guide wiki page. That page contains sample input and output for the system along with simple instructions for running it. On the download page, the group offers an executable jar file for those simply wishing to run the application. In addition, they offer the executable jar file along with the source files, so that external developers can run the system without compiling it first. The distribution file contains a version number along with a timestamp, so external users can monitor any changes and how long ago they took place.
In order to test how robust the system was, I input various commands, both valid and invalid, to see how the product handles them. Here are some of the commands I input, with a description of what was output:          

- “help”
  Gives the proper output.
- “current-power Lehua”
  Gives the proper output of Lehua’s power.
- “current-power lehua”
  Since the tower name is case sensitive, gives a bad syntax error.
- “Current-power Lehua”
  Since the command checker is also case sensitive, gives a bad command error.
- “daily-energy Lehua 2011-11-25”
  Gives the proper output of Lehua’s daily energy for Nov. 25, 2011.
- “daily-energy Lehua 2012-11-30”
  Since the date is in the future, gives a bad date error.
- “daily-energy Lehua 11-25-11”
  Since the date isn't properly formatted, gives a bad syntax error.
- “daily-energy Lehua 11-25-2011”
  Also gives a bad syntax error, because there is only one accepted format for the date.
- “daily-energy Lehua 2011-10-20”
  Gives a date error showing that the date should be after November 22.
- “energy-since Lehua 2011-11-23”
  Gives the proper output of the amount of energy used in Lehua since November 23.
- “energy-since Lehua 2011-12-25”
  Gives a date error.
- “rank-towers 2011-11-23 2011-11-25”
  Gives the proper output ranking the towers from least to most.
- “rank-towers 2011-11-25 2011-11-23”
  Since it recognizes that the start date is after the end date, gives both a date error and a bad syntax error.
- “quit”
  Properly closes the application.

Overall, the system holds up very well and is very useful. It did not crash when given bad inputs, and it produced correct and useful output for valid commands.
 
Finally, the Third Prime Directive, "Can an external developer successfully understand and enhance the system?", was reviewed. For this, I took a look at the Developer's Guide wiki page to see if it provides sufficient instructions on building the system from its sources. TNT's Developer's Guide wiki page does indeed provide clear instructions on how to build the system once the sources are downloaded. Quality assurance is automated by running "ant -f verify.build.xml", which runs Checkstyle, PMD, and FindBugs. The wiki instructs the developer to run this command before making any modifications to the code, to ensure the original code doesn't have any errors. In terms of coding standards, the wiki page mentions the Eclipse format and provides a link to the XML document for developers to use. The wiki also mentions that the project follows issue driven project management and continuous integration. Continuous integration is a practice in which smaller pieces of code are verified in small increments instead of only at the end of construction; often, each integration is verified automatically (in our case using the "ant" tool), which helps the group find out who broke the build, when, and how, if there is a build failure. Issue driven project management is a practice in which the overall build is broken down into smaller issues, each usually about two days' worth of work. For this project, we submitted issues in Google Code and used that service to monitor and add our issues. A link is provided to the continuous integration server associated with the application. The only thing the Developer's Guide is missing is how to generate the JavaDoc documentation. The information contained in the Developer's Guide is very concise and doesn't contain any useless information.
 
 In addition to the Three Prime Directives, a number of other components must be reviewed in order to get a complete picture of the quality of the product: not only its functionality, but also its documentation, process, and progress. JavaDoc, the build system, coding standards, Issue Driven Project Management, and continuous integration must also be reviewed, and for TNT, the results of reviewing these components are as follows:

JavaDoc Review
            I was able to generate the JavaDoc documentation successfully. After reading through the various class documentations, it is clear that the entire application is linked together by the Processor class. The packages are named appropriately: the Main class is in the top-level directory, the command processor is in a subdirectory of the main directory, and the various commands are in another subdirectory of the main directory.

Build System Review
            Using ant with build.xml, I was able to build the system without any errors. The author(s) of each piece of code are listed in the source, so external developers know who wrote what. In terms of coverage, after running JaCoCo, the project does not have 100% coverage. In some cases, such as the Main, Processor, and InvalidArgumentException classes, the coverage is 0%, mainly because no test cases were made for those classes. Elsewhere, the sub-100% coverage comes from the fact that none of the test cases exercise invalid inputs. The current set of test cases tests the application pretty well for valid inputs; however, because of the lack of test cases for invalid inputs, there is a chance that an external developer could write an enhancement that triggers some unexpected exception and not know why it is being thrown.
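To make this concrete, here is a minimal sketch of the kind of invalid-input test that would close the gap. It is written in JUnit style, and the isValidDate helper is a hypothetical stand-in for whatever date check TNT's commands actually use, not their real code.

    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    /** Sketch of a missing invalid-input test; the validator below is a hypothetical stand-in. */
    public class TestDateValidation {

      /** Hypothetical date check standing in for whatever the real commands use. */
      static boolean isValidDate(String date) {
        return date.matches("\\d{4}-\\d{2}-\\d{2}");
      }

      @Test
      public void testValidDateAccepted() {
        assertTrue("YYYY-MM-DD should be accepted", isValidDate("2011-11-25"));
      }

      @Test
      public void testMalformedDateRejected() {
        // These are the same inputs I tried by hand above; a test pins the behavior down.
        assertFalse("MM-DD-YY should be rejected", isValidDate("11-25-11"));
        assertFalse("MM-DD-YYYY should be rejected", isValidDate("11-25-2011"));
      }
    }

Tests along these lines would raise the coverage of the error-handling paths without much extra work.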

Coding Standards Review
            What I found really nice is that each of the command classes is structured in the same way (because they all implement the same interface). What I found surprising at the same time is that hardly any private methods were used, but this is okay since WattDepot already provides many of the needed methods. The only files I had minor trouble reading were the Main class and the CommandManager class. In the Main class, the only thing I had issues with was figuring out when the IOException would be thrown; documentation on what can throw it would have been nice. The CommandManager class was beautifully written using some advanced Java techniques that I had not come across before, so I had some trouble reading it and had to take time to research a few things. That was not a problem caused by the author.
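For context, the uniform structure of the command classes most likely comes from an interface shape along these lines. This is only my guess at what such an interface could look like; the names are assumptions, not TNT's actual code (though an InvalidArgumentException class does exist in their source).

    /**
     * Sketch of a command interface that would explain the uniformly structured
     * command classes; the names here are assumptions, not TNT's actual code.
     */
    public interface Command {

      /** The keyword the user types, e.g. "current-power". */
      String getName();

      /** One-line usage text printed by the help command. */
      String getSyntax();

      /** Runs the command against the parsed arguments, e.g. a tower name and dates. */
      void execute(String[] args) throws InvalidArgumentException;
    }

    /** Hypothetical exception type mirroring the InvalidArgumentException mentioned earlier. */
    class InvalidArgumentException extends Exception {
      public InvalidArgumentException(String message) {
        super(message);
      }
    }

With a contract like this, adding a new command is mostly a matter of writing one more class that implements the interface, which would also explain why enhancing the system is relatively painless.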

Issues Page Review
            The issues page associated with this project makes it very clear who worked on which part of the system. It seems that different people worked on different classes, so if an external developer had any questions about the system, it would be very easy to contact the author(s) of that piece of the system. In the group of three, it seems that two of the developers did more of the work. The other group member worked on a single command class, its test case, the Main class, and the Help class. Most of these classes were fairly short, and this is clearly reflected in the number of issues posted by that author. However, the overall quality of those classes is great, so it is clear that this member worked hard as well but perhaps just didn't have the experience level of the other two.

Continuous Integration Server Review
            The Jenkins continuous integration server shows the history of failed and successful builds. Other than at the very beginning of the project, between builds three and four, where it took about a day and a half to fix the project, all failed builds were fixed within an hour; most were fixed in as little as 10-15 minutes. As for commits being associated with issues, fewer than nine out of ten were tied to an existing issue, but the group came close, with roughly eight out of ten.
 
Conclusion
            From this extensive review of TNT's Hale Aloha command line interface system, it is safe to say that the Three Prime Directives of Software Engineering were fulfilled. The product can successfully be developed and enhanced by new programmers, and everything was well organized and presented. This group definitely had some features and functionality that were more advanced than my own group's, but that is a good thing, because I can learn from this and apply the newly acquired techniques to other projects in the future. TNT's product was extremely robust and very impressive, and I see it as a model to remember and refer to when implementing similar features. Although this review was very lengthy and time consuming, I learned a lot, and I feel that the TNT group did an excellent job working together and putting together a solid product.