
Web Robots Are Dumb But Still Smarter Than You

I have a love-hate relationship with web robots.

When programmed correctly, these robots save me hours and hours of simple web work, such as copying and pasting data into a spreadsheet.

However, when not programmed correctly, they can literally COST YOU hours and hours of your time.

For example,

I created a simple web scraper that copies and pastes data from a vendor website into a spreadsheet for importing.

It basically looks up key data I need, filters it, removes extra styling tags and images, and pastes the raw data into a spreadsheet.
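
For the curious, here's a minimal sketch of what that workflow looks like in Python, assuming the requests and beautifulsoup4 libraries. The vendor URL and CSS selectors below are hypothetical stand-ins, not the real site's.

```python
# A minimal sketch of the scrape-filter-paste loop, assuming the
# `requests` and `beautifulsoup4` libraries. The URL and the CSS
# selectors are hypothetical stand-ins for the real vendor site.
import csv

import requests
from bs4 import BeautifulSoup

def scrape_products(url):
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    products = []
    for card in soup.select("div.product"):  # hypothetical selector
        products.append({
            # get_text() drops the styling tags and images,
            # leaving only the raw text I actually want.
            "name": card.select_one("h2.title").get_text(strip=True),
            "price": card.select_one("span.price").get_text(strip=True),
            "description": card.select_one("p.desc").get_text(" ", strip=True),
        })
    return products

def write_sheet(products, path="products.csv"):
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["name", "price", "description"])
        writer.writeheader()
        writer.writerows(products)

write_sheet(scrape_products("https://vendor.example.com/catalog"))
```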

When this robot is set up properly, I can populate a spreadsheet with 500 products in a matter of minutes. These products can then be uploaded directly to a website and be live and ready to go.

However, these bots didn’t always work this well for me.

In the beginning, this scraper actually cost me more time cleaning up data than it saved.

The web robot would scrape data from the vendor's site and populate its spreadsheet (as it was designed to do).

However, after 10 minutes of running, I would check the data it had gathered and notice it was full of random HTML tags, empty fields, and formatting issues that had to be cleaned up.

This scraper gathered most of the information I needed, but the formatting was all over the place.

Data wasn't in the right cells, descriptions spilled across multiple cells, and some categories were missing entirely.

It took me hours to clean up this sheet and prep it for import.
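
Eventually I scripted the cleanup pass too. Something like this sketch, assuming the stray markup was simple enough for a regex to strip (the real sheet needed more massaging than this):

```python
# A rough sketch of the cell-by-cell cleanup pass, assuming the stray
# markup is simple enough for a regex to strip. Real-world HTML can be
# messier than this handles.
import re

def clean_cell(value):
    value = re.sub(r"<[^>]+>", "", value)       # drop leftover HTML tags
    value = re.sub(r"\s+", " ", value).strip()  # collapse whitespace and newlines
    return value

print(clean_cell("<p>Blue widget</p>\n\n<img src='x.jpg'>"))  # "Blue widget"
```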

I think I could have saved more time by doing the whole job myself manually.

But either way, I respect the bot for doing exactly as it was told. I just wish it would pause and ask me to fix errors before continuing to make the same error AGAIN and AGAIN and AGAIN and AGAIN.
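
If I rebuilt it today, I'd bake that pause in. Here's a sketch of the idea, with hypothetical field names:

```python
# A sketch of the "pause and ask" behavior I wish the bot had. Field
# names are hypothetical; the point is to stop on the first malformed
# row instead of silently repeating the mistake 500 times.
def validate(row):
    problems = []
    if "<" in row.get("description", ""):
        problems.append("stray HTML tag in description")
    if not row.get("category"):
        problems.append("missing category")
    return problems

def gather_with_checkpoints(rows):
    clean = []
    for i, row in enumerate(rows, start=1):
        problems = validate(row)
        if problems:
            print(f"Row {i} looks wrong: {'; '.join(problems)}")
            if input("Keep going anyway? [y/N] ").strip().lower() != "y":
                raise SystemExit("Stopped early so the scraper can be fixed.")
        clean.append(row)
    return clean
```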
