Tom, the founder of Arcon Research, was having problems with seasonality:
- His business still had to pay for his VA (virtual assistant) in the less busy seasons, even though her workload was light
- In the busy seasons, his VA would sometimes get overwhelmed with the workload, which limited his earning potential at critical times
- Sometimes his VA would quit between seasons, which meant he had to re-train someone new on the same process again and again
Arcon Research received a confirmation email for each content-writing project they won. A VA then completed their onboarding process, which was as follows:
- Update their project tracking Google Sheet with info from the email (project name, client name, due date, etc.)
- Cross-reference the marketplace the project came from to find its price
- Copy/paste some of this info into an email, and find a (contracted) writer on the team who might be able to take on the project
The Onboarding Steps We Automated:
1. Update a Google Sheet (Project Tracking List) & Generate Project Files
First, we created an automation that looked for new emails, parsed the content of these emails, and automatically added a new row to his project tracking spreadsheet.
Next, we also automatically created a draft email with the body and subject populated with the right info so that Tom could enter the email address of a writer he wanted to assign the project to, modify a few internal details (like the deadline time for the writer) and hit “send.”
We also had the automation do tedious tasks like:
- Create a Google Drive folder structure for the project and share it with the right people
- Upload project files from the customer to the Google Drive folder
- Calculate the price and profit margins for the project
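The core of this first automation is extracting structured fields from the confirmation email so they can become a spreadsheet row. A minimal sketch in Python, assuming a hypothetical email format (the real marketplace emails, field names, and patterns would differ):

```python
import re

# Hypothetical confirmation-email format; the real marketplace emails
# will differ, so the patterns below are illustrative only.
EMAIL_BODY = """\
Project: Blog Post - 10 SEO Tips
Client: Acme Co
Due: 2024-03-15 18:00
Pages: 4
"""

FIELD_PATTERNS = {
    "project_name": r"Project:\s*(.+)",
    "client_name": r"Client:\s*(.+)",
    "due_date": r"Due:\s*(.+)",
    "pages": r"Pages:\s*(\d+)",
}

def parse_confirmation_email(body: str) -> dict:
    """Extract one spreadsheet row's worth of fields from an email body."""
    row = {}
    for field, pattern in FIELD_PATTERNS.items():
        match = re.search(pattern, body)
        row[field] = match.group(1).strip() if match else ""
    return row

row = parse_confirmation_email(EMAIL_BODY)
```

From here, appending `row` to a Google Sheet is a single API call with whichever client library the automation platform provides.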
2. Calculate the Optimal Internal Deadline
But, Tom’s team still had to set the writer’s deadline manually. We realized this could be automatically calculated based on a few rules:
- The project needed to be checked after the writer had completed it, so we started with a 24-hour buffer (which was what his project managers were doing manually already)
- But this calculation wasn’t always the best idea (i.e., if the project was due in 48h, telling the writer to be done in 24 hours would lead to rushed work).
- So, we modified the calculation to use 70% of the deadline time as the writer time
That way, if the project had a tighter deadline, we would give the writer less time, and if it had a longer timeline, the writer would have more time, but Tom would also have more time to check it over.
This was later modified to use a maximum time of 3 days for the writer, as Tom suspected that providing the writers much more time than that led to procrastination (and if there was a surge in demand later, the writers would be unprepared).
Months later, another problem became apparent.
Occasionally, a project was on a tight timeline AND due early in the morning (6 am), but the writer deadline was very late as well (2-3 am), which meant that someone had to stay up very late to check over the paper.
To allow for better sleep, the writer deadline calculation was further modified so that it always fell within the daytime.
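The deadline rules above (70% of the remaining time, capped at 3 days, shifted into the daytime) can be sketched in a few lines. This is an illustrative reimplementation, not Tom's production code; in particular, the 08:00–20:00 "daytime" window is an assumption, since the text only says the deadline was moved to fall within the daytime:

```python
from datetime import datetime, timedelta

WRITER_FRACTION = 0.7          # writer gets 70% of the remaining time
MAX_WRITER_TIME = timedelta(days=3)  # cap to discourage procrastination
DAY_START, DAY_END = 8, 20     # assumed "daytime" window (hours)

def writer_deadline(now: datetime, client_deadline: datetime) -> datetime:
    remaining = client_deadline - now
    deadline = now + min(remaining * WRITER_FRACTION, MAX_WRITER_TIME)
    # Pull the deadline back into daytime so nobody reviews work at 3 am.
    if deadline.hour >= DAY_END:
        deadline = deadline.replace(hour=DAY_END, minute=0,
                                    second=0, microsecond=0)
    elif deadline.hour < DAY_START:
        deadline = (deadline - timedelta(days=1)).replace(
            hour=DAY_END, minute=0, second=0, microsecond=0)
    return deadline
```

So a project due in 48 hours gives the writer roughly 33.5 hours, while a project due in two weeks still caps the writer at 3 days; either way, a deadline that would land late at night gets pulled back to the previous evening.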
3. Automatically Find the Right Freelancer
Tom’s business grew to the point where he had project managers distributing work to his writers. They’d look at the type of project, and based on which writers were available and had done similar work before, they would send the project to the writer (who could then either confirm or deny the project).
The writers who confirmed the most projects (i.e., the most responsive), completed projects on time and with high quality (i.e., the highest output), and were knowledgeable about the project's topic were most likely to be assigned the project.
The problem, I suspected, was that as Tom's team grew (from 5 to upwards of 80 writers), assigning each project to the optimal writer would take longer and become less accurate. So, we developed an algorithm to suggest a writer automatically. It took into consideration:
- the workload of the project (size in pages/words and deadline)
- the historical output of the writer (if they historically produced many pages per day, on time, with high quality, we estimated their capacity to be high…but if they tended to deny projects, submit things late, or mess up quality, we assumed they were at capacity)
- the real-time workload of each writer (their real-time workload was compared to their estimated capacity to predict if they could take on the workload of a particular project or not)
- who was on holiday or sick
- each writer’s average amount of time to confirm a project, and how often they denied a project (i.e., how responsive they were)
- the topic of the project, and who had said they were an expert in topics like this
- the historical quality of each writer’s work (we also created an algorithm to do automated quality assurance)
- the experience level of each writer (a brand-new writer would only be assigned shorter projects with an extended deadline, until their “level” was changed manually, which ensured new writers had enough training time)
At first (as you’d expect), the algorithm still made mistakes, and Tom’s project managers let us know when they would have made a different choice. But after a few weeks, the “AutoSuggest” algorithm was far more likely to suggest a writer that accepted the project than Tom’s human project managers (not to mention, we saved them considerable time on guessing the right writer, and since the writer was more likely to accept the project, the project could be completed faster as well).
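The factors above amount to a scoring function over writers. Here is a toy sketch of that idea; the field names, weights, and thresholds are all assumptions made for illustration, not the production algorithm:

```python
from dataclasses import dataclass

@dataclass
class Writer:
    name: str
    on_leave: bool           # holiday or sick
    accept_rate: float       # share of offered projects confirmed (0-1)
    quality: float           # historical QA score (0-1)
    on_time_rate: float      # share of projects delivered on time (0-1)
    capacity_pages: float    # estimated capacity from historical output
    current_pages: float     # real-time workload
    topics: set              # self-reported areas of expertise
    experience_level: int    # 1 = brand new

def score(writer: Writer, topic: str, pages: float) -> float:
    """Score a writer for a project; 0.0 means 'do not assign'."""
    if writer.on_leave:
        return 0.0
    if writer.capacity_pages - writer.current_pages < pages:
        return 0.0           # predicted to be over capacity
    if writer.experience_level == 1 and pages > 3:
        return 0.0           # new writers only get short projects
    expertise = 1.0 if topic in writer.topics else 0.3
    # Hypothetical weights; a real system would tune these from feedback.
    return (0.4 * writer.accept_rate
            + 0.3 * writer.quality
            + 0.2 * writer.on_time_rate
            + 0.1 * expertise)

def suggest(writers, topic, pages):
    ranked = sorted(writers, key=lambda w: score(w, topic, pages),
                    reverse=True)
    if ranked and score(ranked[0], topic, pages) > 0:
        return ranked[0]
    return None
```

The hard gates (leave, capacity, experience) rule writers out entirely, while the weighted sum ranks everyone who remains; this mirrors how the project managers worked, first filtering for availability and then picking the best fit.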
4. Make Quality Errors Impossible (Hosted Templates)
One of the common errors that our automated quality checker found was that some writers missed the fundamental requirements of the assignment (citation style, spacing style, and so on). While the writers had templates for each of these styles on their desktops (and were trained on how to use them), it was clear that they were accidentally using the wrong template from time to time.
As a solution, Hildebrandt Automation suggested automatically choosing the right template, and including a link to the template in the project email (in order to reduce human error).
To do this, we created a “templates” table in Airtable, and put a link to each project template in the table (that way, our onboarding automation could see the project type, and automatically select the right template, then send it to the writer).
For example, this might be a template for a double-spaced blog post project:
If this document were your project template (for a content agency), you could get the “share” link (which is “https://docs.google.com/document/d/146zApm56QvumxIWbfHorB7Xddtgvu1ywFlZTaPed3Kc/edit?usp=sharing”).
By replacing the last part of the URL with /copy (“https://docs.google.com/document/d/146zApm56QvumxIWbfHorB7Xddtgvu1ywFlZTaPed3Kc/copy” instead of “/edit…”), anyone who clicks on that link would automatically be prompted to make a copy of your project template doc.
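The link transformation itself is a one-liner, which is easy to drop into the onboarding automation. A small sketch:

```python
# Turn a Google Docs "share" link into a force-copy link by swapping
# the trailing /edit... segment for /copy.
def to_copy_link(share_url: str) -> str:
    base, _, _ = share_url.partition("/edit")
    return base + "/copy"

share = ("https://docs.google.com/document/d/"
         "146zApm56QvumxIWbfHorB7Xddtgvu1ywFlZTaPed3Kc/edit?usp=sharing")
copy_link = to_copy_link(share)
# -> https://docs.google.com/document/d/146zApm56QvumxIWbfHorB7Xddtgvu1ywFlZTaPed3Kc/copy
```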
This allows you to keep your templates up to date in one place and ensures your team is always using the latest versions (which eliminates many quality issues entirely).
Want a Fully-Automated Onboarding Process?
Here are some examples of what we can automate:
- Calculating internal deadlines (just enough time for your team to complete the work, due at the right time, but not too much time that they’ll procrastinate)
- Updating your project management software (Google Sheets, Trello, Asana, Monday.com etc)
- Creating folder structures with custom names for the client/project
- Sending the client confirmation emails or contracts (with fields filled out for the project)
- Deciding who the best people are to send the project to
And all of this can be triggered instantly when your client pays via your payment processor (Stripe, etc.) or fills out your onboarding form.