A simple web-based uploader to upload data from your existing camera traps in the field.
We support live 4G camera traps powered by a solar panel with inbuilt batteries.
A platform designed to support the curation of millions of images with ease from our website.
When using live devices, set alerts for specific species at specific times. All received on any Apple iOS or Android device.
We have helped a number of groups transition away from manual tagging entirely.
A platform that has been used commercially for over 4 years.
Support multiple users with read-only or edit access to your organisational data.
Data ownership is not transferred to us, nor do we have the right to sell or provide it to any third parties.
Data is processed and stored within Australia only.
Data from all our users can be used to train our models, improving outcomes for all.
We are happy to offer a free trial to ensure our clients know exactly what results are possible.
A platform that has analysed over 100 million images. Purpose-built to manage data from larger arrays.
No. The platform is web based. If you have access to the latest version of Chrome, the system is usable from any modern laptop or desktop.
The system has proven to be as accurate as a human at finding cats on a suitably large dataset. The actual accuracy is highly dependent on your target species and your own camera setup. This is why we believe a free trial helps our clients get a clear view of our platform's performance.
If the data has already been uploaded, we can process approximately 1 million user images in a day with ease. Upload speed is highly dependent on your connection to our platform. This can be evaluated during our trial.
The platform currently can separate out empties and look for the following species:
'horse', 'macropod', 'pig', 'rabbit', 'sheep', 'cat', 'dog', 'cow', 'fox', 'vehicle', 'person', 'bird', 'deer', 'goat', 'echidna', 'lizard', 'possum', 'mouse', 'bandicoot', 'wombat', 'quokka', 'rat', 'quoll', 'koala', 'amphibian', 'snake', 'penguin', 'potoroo long-nosed', 'emu', 'bush stone-curlew', 'domestic dog', 'stoat', 'duck', 'donkey', 'pied oystercatcher', 'hedgehog', 'crow', 'magpie', 'malleefowl', 'tammar wallaby', 'black footed rock wallaby'
Yes, the system has a larger set of Australian wildlife labels that you can specifically assign to identified species. For example, we may tag a bird but you know it is a sulphur-crested cockatoo. You can "relabel" this data and it will be stored on record from then on. It will form part of data analytics and any downloaded information.
CSV. Our system outputs a simple CSV with all the relevant information along with any extra tags you have provided. This allows you to feed the output into R, Python, or Excel for further processing.
Sure. We will work with your data and help you create enough training data, where possible working with other groups that hold data you might need, provided all parties agree. This does require a little manual effort, but we will support the process and provide some key statistics on how achievable your goals are based on the data you have.
Yes. The platform allows you to create separate accounts for the various users in your organisation. You can simply let them view data or give them edit control.
A live device is a device that can send images via 4G cellular connectivity. We currently recommend using the Swift Enduro 4G cameras for this purpose and ensure compatibility between our platform and this product.
Yes. The platform will see both as devices. Only live devices can trigger alerts and have their data processed automatically as it arrives from the field. This allows our users to keep legacy hardware working hard.
A number of systems are in trial. Contact us to let us know your needs and we can see if we have space to accommodate expansion. We have limited resources and are trying to solve this fairly large and complex problem on a shoestring budget.
We are trialling systems running in other countries. If the demand is significant, this can be arranged. Please contact us to try out our North American and New Zealand models, or we can help build new models suitable for your flora and fauna.
We’re an optimistic and gritty team of Technauts, exploring the wildlife space with a technological toolkit and a down-to-earth approach.
Boldly going where few bother. Enabling wildlife professionals to spend more time saving precious flora and fauna and less time doing admin.
Read on to learn more about eVorta’s story and history and see the full eVorta team.
It’s 2018 – artificial intelligence for self-driving cars is taking off. Yet researchers are manually burning hours filtering out empty images from camera trap datasets, let alone knowing what species are actually present.
We couldn’t believe there wasn’t a cleaner, simpler platform out there.
Maybe we were a little naïve, but we decided to stop waiting for promised solutions from larger established institutions and tried to make it ourselves.
No venture capitalists. No consultants. No office. No funding. Just pure passion & hope. Eager and super broke. Upfront about it.
We don’t have anything against conventional wisdom. We just don’t always follow it. Even today – we dream big, the usual insecurities, a tonne of failures and all the other stuff that keeps us human. And this will never change.
We started eVorta on a balcony – a few machines burning through NVIDIA 1080 Ti cards, generating our first models.
No time for a website. Just a few first clients that put their faith in us. No defined marketplace. Nothing to compare ourselves to.
Hey, we’re not marketers, but we held our breath and did it. Still learning and adapting every day as new challenges surface.
It’s 2024 – We haven’t looked back since…
A little news about our work.
We encourage any external parties to reach out to us in order to compare our product with other solutions.
“The use of eVorta to classify images saved $0.27 m (10%), while the use of 4G-connected cameras combined with eVorta yielded cost savings of $2.15 m (81%) over traditional camera traps requiring manual image download and processing”
“Humans missed 11 cat sequences while eVorta missed only one cat sequence.”
Download Full Document Here.
[James Smith, Ashleigh Wycherley, Josh Mulvaney, Nathan Lennane, Emily Reynolds, Cheryl-Ann Monks, Tom Evans, Trish Mooney & Bronwyn Fancourt]
Evaluation of data processing platforms for camera trap data
“The most obvious improvement from eVorta against the other reviewed classification solutions is the significant improvement in recall (89.2%) while maintaining high precision (92.2%). More importantly, it captured 97.7% of cat events that occurred when filtering the dataset with the high confidence setting (0.99).”
Download Full Document Here. [Now newer models exist and our platform supports live processing of uploaded data]
[E. Oyston and J. Tinnemans. August 2022. Department of Conservation]
Using the eVorta platform to develop a machine learning automated classification/detection model for stoats in the New Zealand environmental context: A preliminary trial.
“At 0.5 confidence, ‘Bestla’ picked up one more stoat event than manual classification did”
Download Full Document Here.
[Em Oyston, May 2022, Pure Salt Ltd NZ]
A technologist who previously worked in defence and finance, leveraging high-performance computation to build the eVorta stack from the ground up.
A lover of the Australian bush and its critters, and keen to be a part of the restoration of our native species. Worked extensively in the feral animal field throughout the Queensland rangelands.
We would love to hear from you. Feel free to reach out using this form.
© All Rights Reserved.