In this article we will examine the process of building Dacris Benchmarks 10, from prototype to finished product, and how we used Maestro Framework 2.2 to get it done in record time.
We used Maestro Web's intuitive forms designer to build the input form for Dacris Benchmarks 10. The form consisted of a few checkboxes (one for each test) and numeric inputs for the test duration and the number of passes. We provided default values for the form fields, marked them as required, and exported the form as JSON. Next, we pasted the JSON into the form viewer (/viewer.html) and exported the form page as an HTML page - bmarkform.html.
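To give a feel for what that step produces, here is a rough sketch of an exported form definition. The field names, labels, and overall JSON schema below are illustrative assumptions - the actual format produced by Maestro Web's forms designer is not shown here.

```json
{
  "title": "Dacris Benchmarks 10 - Test Settings",
  "fields": [
    { "name": "cpu",      "type": "checkbox", "label": "CPU test",      "default": true,  "required": true },
    { "name": "memory",   "type": "checkbox", "label": "Memory test",   "default": true,  "required": true },
    { "name": "storage",  "type": "checkbox", "label": "Storage test",  "default": true,  "required": true },
    { "name": "ai",       "type": "checkbox", "label": "AI test",       "default": false, "required": true },
    { "name": "internet", "type": "checkbox", "label": "Internet test", "default": false, "required": true },
    { "name": "duration", "type": "number",   "label": "Test duration (seconds)", "default": 60, "required": true },
    { "name": "passes",   "type": "number",   "label": "Number of passes",        "default": 3,  "required": true }
  ]
}
```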
We wanted the back-end of Dacris Benchmarks 10 to run cross-platform and potentially inside a container. This is where Maestro Web helped speed things up. Maestro Web already provides a REST API that can launch any Maestro app, so all we had to do on the back-end was create the Maestro app for Dacris Benchmarks - Bmark. We used Maestro Assembler to set up Bmark and a few child apps - one for each performance test. In total, we set up 5 sub-apps: cpu, memory, storage, ai, and internet. The sub-apps were very similar to each other, so once we had one working, we reused the same wrapping instructions in the others, just with a few different State.json keys. We set up the first sub-app, cpu, and tested it through the Maestro Web front-end. Once we were happy that it worked well - this took 8 hours in total - we copied and pasted the cpu app to create the other sub-apps.

Finally, we created an HTML template to display test results in a beautiful chart and table format, as sketched below. To create this template, we had to write a bit of JavaScript, HTML, and CSS - about 100 lines in total. This was the only code we had to write by hand for this app; the rest was all done through Maestro Assembler or with drag and drop. In total, the back-end Bmark app ended up at about 125 lines of code and was finished (prototype-ready) in 3 days.
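The snippet below is a minimal sketch of the kind of JavaScript that results template contains, not the actual template. The shape of the results object, the element IDs, and the scores shown are all assumptions made for illustration.

```javascript
// Illustrative only: assume Bmark produces a results object keyed by test name.
const results = {
  cpu:      { score: 1520, unit: "ops/s" },
  memory:   { score: 980,  unit: "MB/s" },
  storage:  { score: 450,  unit: "MB/s" },
  ai:       { score: 37,   unit: "inferences/s" },
  internet: { score: 210,  unit: "Mbps" }
};

// Render a results table and a simple horizontal bar chart.
// Assumes the page contains <table id="resultsTable"> and <div id="chart">.
function renderResults(results) {
  const rows = Object.entries(results)
    .map(([test, r]) => `<tr><td>${test}</td><td>${r.score}</td><td>${r.unit}</td></tr>`)
    .join("");
  document.getElementById("resultsTable").innerHTML =
    `<tr><th>Test</th><th>Score</th><th>Unit</th></tr>${rows}`;

  // Scale each bar relative to the best score so the chart fills the width.
  const max = Math.max(...Object.values(results).map(r => r.score));
  document.getElementById("chart").innerHTML = Object.entries(results)
    .map(([test, r]) => `<div class="bar" style="width:${(r.score / max) * 100}%">${test}</div>`)
    .join("");
}

renderResults(results);
```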
Finishing the app involved adding branding to the Maestro Web pages - a bit of HTML+CSS to add a logo and change some styles. We also created a shell script to build the app for 6 different platforms and deliver everything as one large ZIP file. The shell script had to integrate Maestro Framework with Maestro Web, create the binaries (using the .NET SDK), and finally zip everything up. In addition, we used the Dockerfile provided with Maestro Web to enable creation of Docker containers from the platform-specific binaries.
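For readers who want a concrete picture, here is a simplified sketch of such a build script. The project name, output paths, and exact platform list are assumptions; the real script also has to pull in the Maestro Web integration steps, which are omitted here.

```bash
#!/bin/sh
# Rough sketch of a multi-platform build script (illustrative, not the actual script).
set -e

# Six example .NET runtime identifiers; the real platform list may differ.
RIDS="win-x64 win-arm64 linux-x64 linux-arm64 osx-x64 osx-arm64"

rm -rf dist
mkdir -p dist

for rid in $RIDS; do
  # Publish a self-contained binary for each target platform.
  dotnet publish Bmark.csproj -c Release -r "$rid" --self-contained true -o "dist/$rid"
done

# Bundle all platform builds into a single ZIP file.
cd dist && zip -r ../dacris-benchmarks-10.zip . && cd ..
```

From the platform-specific output folders, container images can then be built against the Dockerfile that ships with Maestro Web using a standard docker build.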
A project of this type would normally take us 1-2 months to develop - that is how long Dacris Benchmarks 4.9 and 8.1 each took. Dacris Benchmarks 9.0 took a bit longer because we had to engineer brand-new 3D tests. In the end, the average is about 2 months of dev time. With Maestro Framework, we were able to cut our dev time down to less than one week: 6 days in total! Granted, things are simpler in this version, but we also added new features, like an AI test, a REST API, cross-platform support, and containerization. In conclusion, Maestro Framework 2.2 is mature enough for production-level app development and deployment. While testing the back-end, we discovered zero Maestro Framework bugs - another sign that the framework is stable.