John
The capstone project
This project was created by team Placeholder, consisting of James Dang, Jaydon Tse, Edward Gauld, Jake Edwards, and myself.
Why was this created?
Task Master was created through the undertaking of COMP3900, the capstone project course. The idea was one of the many selections available, but as a team we decided on this particular idea for a number of reasons:
- Our familiarity with the problem of managing day-to-day tasks
- A few of us had already used competing apps in the space (Todoist, Google Tasks, etc.)
- The ideas we had already formed for possible implementations and additions to the problem's scope (ones that I will mention later)
After our first few meetings, it was decided that, owing to our lack of creativity (our team's name is "Placeholder"...), the app would be named "Task Master", exactly as provided in the project's specification.
The project itself set out to solve a number of issues, including:
- Allowing large, scaling teams to assign and manage tasks within the workplace
- Estimating busyness (yes, that is a word) based on assigned tasks
- Providing a network management system that allows users to add and remove individuals within their workplaces
Given 10 weeks, we were to fully build the system, walk through our implementation, and deliver a number of reflections and reports during the process. We were also required to include our own additions to increase the novelty of the product, so we introduced two new feature requirements:
- A chat messaging system to allow users to directly communicate without the need for an external platform
- A machine learning model that would allow for the generation of busyness values based on past user data (periodically fed).
How it was built
The frontend
The frontend in its entirety was built by our frontend team (James Dang and Jake Edwards).
Built with React, using Redux for state management and styled-components for styling, every effort was placed into ensuring that the platform could be built as quickly as possible.
While the initial backend endpoints were being developed, the team completed the initial mockup for Task Master, as well as the prototype that served as the base for the initial implementation.
You can find the Figma workspace here as well as the working prototype here.
Alongside the requirement of building the frontend, the addition of the chat messaging feature meant that a messaging system had to be built so users could send messages to one another. A real-time messaging server was built using Socket.io, run concurrently with the frontend and backend, acting as middleware that enabled the transmission and restoration of messages between users.
The backend
The backend in its entirety was built by our backend team (Edward Gauld, Jaydon Tse, and myself).
With the attitude that efficiency is king, the backend team decided on Django and its REST framework as the main backend technologies used to implement the features needed. This was selected due to the high modularity of the packages that accompany Django, which allowed for shorter development times and a more efficient backend development process. A number of open-source packages were also used in conjunction with Django, including:
- django-friendship
- django-apscheduler
- django-rest-passwordreset
- PyTorch
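For anyone wiring similar packages into a Django project, the usual first step is registering their app labels in settings.py. The sketch below assumes a standard install of each package; the app labels follow each package's own documentation and may differ from the actual Task Master configuration.

```python
# settings.py (sketch) - app labels assumed from each package's documentation
INSTALLED_APPS = [
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.contenttypes",
    "rest_framework",              # Django REST framework
    "friendship",                  # django-friendship
    "django_apscheduler",          # django-apscheduler
    "django_rest_passwordreset",   # django-rest-passwordreset
    # PyTorch is a plain Python dependency, not a Django app
]
```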
As well as the integration of the API endpoints required for Task Master to function, the additional requirement of machine-learning-generated busyness estimates called for an entirely new sub-project within the Task Master scope. This project was named the "Busyness Delta Project", with the creation of the machine learning model handled by Jaydon Tse and integration with the backend done by myself.
You can find the specifications of the Delta Project here.
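The integration side of the Delta Project boils down to turning a model's raw output into a bounded busyness value the API can serve. Here is a minimal sketch of that idea in plain Python; the names (predict_busyness, the 0-100 scale) are illustrative, not the actual Delta Project code, and the real model was a PyTorch module rather than a plain callable.

```python
# Hypothetical sketch of the integration layer: turning a model's raw output
# into a bounded busyness value. In Task Master the `model` was the Delta
# Project's PyTorch model; here it is any callable on a feature dict.

def clamp(value, low=0.0, high=100.0):
    # Keep the score within the displayable busyness range
    return max(low, min(high, value))

def predict_busyness(model, features):
    raw = model(features)
    return round(clamp(raw), 1)
```

The clamp matters because a model trained on unrepresentative data can emit values far outside the expected range.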
Alongside the machine learning integration, functions were built to enable the model to self-update and become self-sufficient, with user data anonymously utilised to sharpen the model's accuracy. The model, as well as user-specific feedback forms, were regenerated/reset every Saturday via django-apscheduler. The code that initiates this is below in case anyone wants to do something similar.
# Periodic busyness model update
# Runs every Saturday at 12AM based on the latest data
import sys

from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.cron import CronTrigger
from django_apscheduler.jobstores import DjangoJobStore, register_events


def updateBusynessModelSchedule():
    scheduler = BackgroundScheduler()
    scheduler.add_jobstore(DjangoJobStore(), "default")
    # Run this job every Saturday at 12AM
    scheduler.add_job(
        updateBusynessModel,
        trigger=CronTrigger(
            day_of_week="sat", hour="00", minute="00"
        ),  # Midnight on Saturday.
        id="update_busyness_model",
        max_instances=1,
        replace_existing=True,
    )
    register_events(scheduler)

    isExistTable = False
    # Make sure the jobs table exists before starting the scheduler
    try:
        from django.db import connection
        cursor = connection.cursor()
        cursor.execute('SELECT * FROM django_apscheduler_djangojob')
        cursor.close()
        isExistTable = True
    except Exception as e:
        print(e)

    # Only start the scheduler if the table exists
    if isExistTable:
        try:
            print("Busyness interval update started...", file=sys.stdout)
            scheduler.start()
        except KeyboardInterrupt:
            print("Busyness interval shutdown successful", file=sys.stdout)
            scheduler.shutdown()
The Challenges
As with all projects, challenges arose along the way and had to be solved. The only difference is that, this time around, we documented them.
Some of this documentation has been included below, taken from the final report. You can read the entire report here.
Real-time Messaging (Socket.io)
Initially, we had planned to constantly poll the servers to retrieve and send messages, which would give the illusion of real-time updates. However, this would have heavily congested the server, even at a small scale, and could potentially show messages out of order.
For the first implementation, we planned to use the Django Channels package for messaging. Integrating messaging with the backend code also made sense, as it would centralise the components. However, the package was unintuitive when it came to building the private messaging service, requiring many workarounds to get the foundations working. Showing users as online/offline also seemed too big a challenge, which almost led us to scrap the idea of real-time messaging entirely.
Instead, we experimented with Socket.io, which would not be integrated with the Django server and would instead act as its own standalone Node.js server. While we were sceptical about implementing it this way, since it decentralised the code even more, it turned out to be the better solution.
The Socket.io server was able to create socket ids uniquely assigned to each individual web browser. These socket ids allowed messages to be redirected to the correct users in real time. Socket.io also allowed broadcasting the list of currently online users, which was how we implemented online/offline status.
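The bookkeeping described above can be sketched in plain Python. The real implementation lives in the Node.js Socket.io server, and the names here (register, route_message, online_users) are hypothetical; the sketch only shows the socket-id-to-user mapping that makes message routing and online status possible.

```python
# Hypothetical sketch of the socket-id-to-user bookkeeping a messaging
# server performs. Not the actual Task Master code.

class SocketRegistry:
    def __init__(self):
        self.user_by_socket = {}   # socket id -> user id
        self.socket_by_user = {}   # user id -> socket id

    def register(self, socket_id, user_id):
        # Called when a browser connects and identifies its user
        self.user_by_socket[socket_id] = user_id
        self.socket_by_user[user_id] = socket_id

    def unregister(self, socket_id):
        # Called on disconnect; the user is now offline
        user_id = self.user_by_socket.pop(socket_id, None)
        if user_id is not None:
            self.socket_by_user.pop(user_id, None)

    def route_message(self, to_user):
        # Return the socket id a message should be emitted to, if online
        return self.socket_by_user.get(to_user)

    def online_users(self):
        # Broadcast this list to show online/offline status
        return list(self.socket_by_user.keys())
```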
Busyness estimates and the acquisition of initial data
The lack of realistic data reduced the capability and accuracy of the model's predictions, as the model depended upon large amounts of data to reach a higher level of accuracy. Due to the unavailability of accurate task data within our project's ecosystem, randomly generated data was used to train the machine learning model. This meant that the generated busyness values were not accurate, resulting in irregular behaviour such as busyness being unresponsive to large changes in user workload. Though this issue is mitigated once user-generated data is introduced, it still posed a problem during development and prototype testing/verification.
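To illustrate why random training data causes the unresponsiveness described above, here is a sketch of a synthetic data generator. The feature names and ranges are hypothetical, not the Delta Project's actual feature set; the key point is that the label is drawn independently of the features, so a model trained on it cannot learn any relationship between workload and busyness.

```python
import random

# Hypothetical sketch of random training-data generation for a busyness
# model. The label is independent of the features, which is exactly why a
# model trained on such data is unresponsive to changes in user workload.

def generate_sample(rng):
    # Features a task-management app could plausibly derive per user
    features = {
        "open_tasks": rng.randint(0, 50),
        "due_this_week": rng.randint(0, 20),
        "avg_task_hours": rng.uniform(0.5, 8.0),
    }
    # Random busyness label in [0, 100], unrelated to the features
    label = rng.uniform(0, 100)
    return features, label

def generate_dataset(n, seed=0):
    rng = random.Random(seed)
    return [generate_sample(rng) for _ in range(n)]
```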
The result
Through thick and thin, our team completed the project in its entirety, receiving a High Distinction mark for our efforts. The project is currently not hosted anywhere, but you can take a look at the source code here, alongside any of the other uni work that I have chosen to open source.
The final demonstration product was run on Linux via a VM. All instructions are located within the repository's readme.
Ultimately, I believe that this project taught all of us a little of everything when it comes to software development. From strengthening my own time management, to team coordination skills, and even writing better code, the project has been an invaluable experience that will forever be on my transcript. 🎓