
How we gave back 2 days of work per week to our administration officer with MS Power Automate

Valerie Sandford

Project Manager
ARQ Group

At ARQ, we are aware of the negative impact that repetitive tasks can have on our employees, sapping their motivation and mental wellbeing. That’s why we strive to reduce repetitive manual processes wherever possible.

We decided to experiment with Microsoft Power Automate to improve our internal processes, and timesheet reminders seemed a perfect use case. In this post we will explain how we automated those reminders with Microsoft Power Automate.

As in many companies, our employees must submit their timesheets at the end of each week. For us, a timesheet records the number of hours worked on each project and client, along with any leave taken. But a number of us forget to submit our timesheets, and our administration officer was sending hundreds of Slack, Microsoft Teams, and email messages each week to remind employees and their managers. Our objective was to automate these reminders with Microsoft Power Automate and give valuable time back to our administration officer and employees.

The first step was to read a Salesforce report that lists all missing timesheets. It turned out to be easier to read data from SharePoint than directly from Salesforce, so we automatically exported the report into SharePoint instead.

To do this, we subscribed to the report in Salesforce, which emailed us the data as a CSV attachment twice a week. Microsoft Power Automate’s integration with Outlook allowed us to read these emails and copy any attachments into SharePoint, where the flow could then access the file using the site address and file path.
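As an illustration of the site-address-plus-file-path pair, the saved attachment can be given a predictable name derived from the export date. The site URL and folder below are hypothetical, and this is a plain Python sketch of the naming convention rather than the flow itself:

```python
from datetime import date

# Hypothetical SharePoint site; Power Automate's SharePoint actions take
# a "site address" like this plus a file path relative to the site.
SITE_ADDRESS = "https://example.sharepoint.com/sites/Operations"

def report_file_path(export_date):
    """Path (relative to the site) where the flow saves the CSV
    attachment, one file per Salesforce export."""
    return f"/Shared Documents/Timesheets/missing-{export_date.isoformat()}.csv"

print(report_file_path(date(2021, 6, 4)))
# -> /Shared Documents/Timesheets/missing-2021-06-04.csv
```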

The next step was to read each line of the CSV report. We chose to export the report as CSV rather than Excel because Power Automate requires Excel data to be organized into tables. We tried to create those tables via Power Automate, but ran into other issues (the file got locked, or we couldn’t retrieve the most recently updated data). Reading a CSV file, however, required writing a small amount of code. This step could be daunting for the non-developers that Power Automate clearly targets, and better support for this use case in the future would help keep Power Automate apps as low-code as possible.
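The “small amount of code” boils down to splitting the CSV text into rows and reading the columns you need. The snippet below is a minimal Python equivalent of that logic (the real flow used Power Automate expressions, and the column names here are hypothetical):

```python
import csv
import io

# Hypothetical extract from the Salesforce "missing timesheets" report.
REPORT = """Employee,Email,Week Ending
Jane Smith,jane.smith@example.com,2021-06-04
Bob Jones,bob.jones@example.com,2021-06-04
"""

def parse_missing_timesheets(csv_text):
    """Return one dict per missing-timesheet row."""
    return list(csv.DictReader(io.StringIO(csv_text)))

for row in parse_missing_timesheets(REPORT):
    print(row["Email"], row["Week Ending"])
```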

The third step was to send missing-timesheet reminders to the employee. For each missing timesheet we were able to send email, Slack, or Microsoft Teams reminders. To send Slack messages, we needed to ensure that the Slack username format (`firstname.lastname`) was consistently applied across the company.
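Because the `firstname.lastname` convention matched our email addresses, the Slack handle could be derived mechanically. A minimal sketch of that mapping, assuming the convention holds for every employee:

```python
def slack_username(email):
    """Derive a Slack handle from a work email address, assuming the
    company-wide `firstname.lastname` naming convention."""
    local_part = email.split("@")[0].lower()
    return f"@{local_part}"

print(slack_username("Jane.Smith@example.com"))  # -> @jane.smith
```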

The final step was to send reminders to an employee’s manager. If your Office365 employee profiles are up to date, Power Automate can retrieve the user profile, which includes details of their manager.
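Under the hood, that lookup resolves the manager from the user’s directory profile. If you were doing the same thing yourself outside Power Automate, Microsoft Graph exposes it at `/users/{id}/manager`; the sketch below builds that request (token acquisition is out of scope here, and `get_manager_email` needs a real access token with directory read permission to actually run):

```python
import json
import urllib.request

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def manager_url(user_principal_name):
    """Microsoft Graph endpoint that returns a user's manager."""
    return f"{GRAPH_BASE}/users/{user_principal_name}/manager"

def get_manager_email(user_principal_name, access_token):
    """Fetch the manager's email address for a user (requires a valid
    OAuth access token with permission to read directory data)."""
    req = urllib.request.Request(
        manager_url(user_principal_name),
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("mail")
```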

Although we faced a few challenges along the way, this experiment with Microsoft Power Automate successfully allowed us to automate timesheet reminders across the company. Each week, more than 200 email, Slack, and Microsoft Teams reminders are sent automatically to employees and managers. It gave back 2 days of work per week to our administration officer, who can now focus on what really brings value to the company.

We enjoyed Power Automate’s tight integration with Office365 user profiles, Outlook, SharePoint, and Microsoft Teams. It also highlighted areas across the company where we needed to ensure our data was up to date.

If you wish to know more about how we automated this process using Microsoft Power Automate, feel free to contact us at emerging.technologies@arq.group

Digital Twins are here, we’ll tell you what they are!

Jaysen McGuiness

Platform Engineer at ARQ Group

The pace of change and the accelerated arrival of Industry 4.0 have forced many areas of business, and society in general, to adjust quickly to this new digital way of working. COVID takes some of the responsibility, but the movement was already underway.

‘Digital Twin’ is being used as a buzzword across all industries, so this brief introduction to how we see it being used now and into the future will help explain what it really means, and how it can add tangible business value to you.

The term ‘Digital Twin’ was most notably used by NASA in the early 2000s, defined as “a virtual representation in virtual space of a physical structure in real space and the information flow between the two that keeps the former synchronized with the latter”. It allowed NASA to design and model how the various systems of future rockets would best work with each other before so much as a screwdriver was picked up.

The simplest explanation of a Digital Twin is that it gives you a dashboard-level view of your plant, process, or operation wherever you are, as a digital overlay of those components. More than just remote camera or sensor monitoring, a Digital Twin becomes an engaged agent: following rules you set to look out for issues even before they happen, continually learning from and advising on your whole connected ecosystem, and (potentially) actively intervening to prevent dangerous and/or costly issues from occurring. Further, that intelligence can be used to model ‘what-if’ scenarios to help build better, more economically viable, safer, and environmentally cleaner processes.

While still evolving, recent implementations range from biochemists analysing cells at very small scales right through to city- and country-wide scales, where everything from water and power utilities to man-made structures and naturally occurring features is being modelled and tracked with IoT sensors, alongside what is becoming the most typical implementation: manufacturing process monitoring and facilities management. This type of ‘smart’ integration was the first step towards a Digital Twin. Combined with the often significant amount of historical and streamed data so prevalent in our modern world, the Digital Twin becomes the collection and aggregation point where we can begin to make sense of that ocean of data and real-world telemetry, and begin to make it work for us.

Data on its own is, however, ultimately useless if it is not used, and the art of knowing what is relevant is where the true value of Digital Twins lies. It is more than just having every sensor out there; it’s knowing which ones are providing useful insights or giving you that critical early warning.

The anatomy of a Digital Twin requires consideration of the following points:

  • Understand the business questions/values that are being answered
  • Determine what existing sensors/data are in place, and identify any more that may be required
  • Examine all levels of the business to understand methods of engagement
  • Keep the scope small for a first-time implementation
  • Work to deliver tangible business value as early as possible once implemented

There is a catch: attempting to build a large number of complex processes and capture points can take a very long time. While keeping that end state in mind, the first approach should be to build a thin sliver that has a discrete, demonstrable positive impact, shows value, and integrates into existing work practices and environments without significant disruption. This limiting is what will provide the relevant template for your business or industry, and will help move the Digital Twin concept from being heavily bespoke and specific to each implementation towards a set of patterns that can be re-used as more examples are realised. This will have the effect of commoditising the approach, making it more common while democratising the intelligence and deep knowledge that continues to evolve as your business does.

In most cases, this initial implementation is a 3-6 month window in which an outcome can be proven, at a cost of around $50k. Once this value is clear, the options for further enhancement, and how to prioritise them, become clearer, allowing you to take advantage of new advancements in what is a highly dynamic and changeable digital space.

James Litjens is the Director of Emerging Technologies at ARQ Group. When James isn't leveraging tech for clients or delving into what's hot, he's building his own mobile apps, competing in triathlons and playing the drums in his apartment (at 1 am). Ever-so-considerate, James wears headphones when playing his electric drums. James' real drum kit is stored in a secret location with no neighbours. You can reach James at: james.litjens@arq.group
