The BatAI application lives on NABat servers within NABat's own AWS infrastructure.
A user can launch the BatAI application, which appears as a separate page. NABat passes the authentication information (a JWT) to BatAI, along with the current file/project/waveform being viewed.
BatAI uses this token and the NABat GraphQL API to retrieve information about the waveform. This includes metadata such as the GUANO metadata, as well as a presigned S3 URL for the waveform itself.
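A minimal sketch of that retrieval step, assuming a plain `requests` client; the endpoint URL, query shape, and field names (`acousticFile`, `guanoMetadata`, `presignedUrl`) are placeholders and would need to match the actual NABat GraphQL schema:

```python
import requests

# Placeholder endpoint; the real NABat GraphQL URL differs.
NABAT_GRAPHQL_URL = "https://nabat.example.org/graphql"

# Hypothetical query; field names are assumptions, not the real schema.
WAVEFORM_QUERY = """
query waveformInfo($fileId: Int!) {
  acousticFile(id: $fileId) {
    fileName
    guanoMetadata
    presignedUrl
  }
}
"""

def fetch_waveform_info(jwt_token: str, file_id: int) -> dict:
    """Use the NABat-issued JWT to fetch GUANO metadata and a presigned S3 URL."""
    response = requests.post(
        NABAT_GRAPHQL_URL,
        json={"query": WAVEFORM_QUERY, "variables": {"fileId": file_id}},
        headers={"Authorization": f"Bearer {jwt_token}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["data"]["acousticFile"]
```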
BatAI then uses the presigned S3 URL to download the waveform, create the spectrograms, and possibly run ML inference for the species type.
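A rough sketch of the download-and-spectrogram step using `requests` and `scipy`; the actual BatAI pipeline may use different libraries and parameters:

```python
import requests
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def build_spectrogram(presigned_url: str, local_path: str = "/tmp/waveform.wav"):
    """Download the waveform from the presigned S3 URL and compute a spectrogram."""
    # Presigned URLs are time-limited, so download promptly after fetching them.
    with requests.get(presigned_url, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        with open(local_path, "wb") as fh:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                fh.write(chunk)

    sample_rate, samples = wavfile.read(local_path)
    if samples.ndim > 1:
        samples = samples.mean(axis=1)  # mix down multi-channel recordings
    freqs, times, sxx = spectrogram(samples, fs=sample_rate, nperseg=1024)
    # Log scaling is typical for visualizing bat echolocation energy.
    return freqs, times, 10 * np.log10(sxx + 1e-12)
```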
The user can then view the spectrogram and make a decision about the current annotation. Once the annotation is created, a 'Push Data to NABat' button allows the user to make a GraphQL call that updates the Manual Annotation Id for the file.
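A hedged sketch of that push-back call; the mutation name and arguments here are hypothetical and would need to match the real NABat GraphQL schema:

```python
import requests

# Hypothetical mutation; the actual field for the Manual Annotation Id may differ.
PUSH_MUTATION = """
mutation pushAnnotation($fileId: Int!, $annotationId: Int!) {
  updateAcousticFile(id: $fileId, manualAnnotationId: $annotationId) {
    id
  }
}
"""

def push_annotation(jwt_token: str, file_id: int, annotation_id: int) -> None:
    """Send the user's manual annotation back to NABat ('Push Data to NABat')."""
    resp = requests.post(
        "https://nabat.example.org/graphql",  # placeholder endpoint
        json={
            "query": PUSH_MUTATION,
            "variables": {"fileId": file_id, "annotationId": annotation_id},
        },
        headers={"Authorization": f"Bearer {jwt_token}"},
        timeout=30,
    )
    resp.raise_for_status()
```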
Cloud Deployment:
Currently BatAI uses Docker containers, with Celery for async tasks and RabbitMQ as the message queue for creating the spectrograms and running inference.
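For reference, the current Celery/RabbitMQ pattern looks roughly like this; the broker URL and task body are illustrative, not the actual BatAI configuration:

```python
from celery import Celery

# Illustrative broker URL; the real BatAI settings may point elsewhere.
app = Celery("batai", broker="amqp://guest:guest@rabbitmq:5672//")

@app.task
def generate_spectrogram(file_id: int, presigned_url: str) -> str:
    """Async task: download the waveform, render spectrograms, and optionally run inference."""
    # In the real pipeline this would call the download/spectrogram/inference code;
    # here it only records that the task ran.
    return f"processed file {file_id} from {presigned_url}"
```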
NABat uses AWS Lambda functions for its async tasks.
Options for BatAI:
Convert functions to AWS Lambda - Some of the local development setup would need to be converted to make development a bit easier. There is an option to create an AWS Lambda function from a container image. This could be combined with boto3 in Django to invoke the function, which would then produce the results and log them (see the sketch after this list).
AWS Fargate - a serverless way to run containers. I believe this was used in the RD-WATCH deployment to AWS for Celery tasks.
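A sketch of what the Lambda option could look like from Django, assuming a hypothetical container-image function named `batai-spectrogram`; the region, function name, and payload shape are all placeholders:

```python
import json
import boto3

# Region is an assumption; it would come from Django settings in practice.
lambda_client = boto3.client("lambda", region_name="us-west-2")

def invoke_spectrogram_lambda(file_id: int, presigned_url: str) -> int:
    """Invoke the container-image Lambda asynchronously instead of queueing a Celery task."""
    response = lambda_client.invoke(
        FunctionName="batai-spectrogram",   # hypothetical function name
        InvocationType="Event",             # async; the function logs its own results
        Payload=json.dumps({"fileId": file_id, "presignedUrl": presigned_url}),
    )
    return response["StatusCode"]
```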