
Backend

Endpoint documentation for the Ground Station (GS) backend

Intro

The backend for the Ground Station code is a Flask server written in Python. Run main.py to start the server.
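
A minimal sketch of how such a main.py could be structured is shown below. This is an assumption for illustration only; the endpoint implementations, port, and module layout in the real repository will differ.

```python
# Hypothetical sketch of main.py: a Flask app exposing the endpoint
# groups documented below. All names and values here are illustrative.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/mav/quick", methods=["GET"])
def mav_quick():
    # The real backend fills this dictionary from Pixhawk telemetry.
    return jsonify({"altitude": 0, "groundspeed": 0})

if __name__ == "__main__":
    # Started with: python main.py
    app.run(host="0.0.0.0", port=5000)
```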

Reference

GET - HTTP GET request

POST - HTTP POST request

Documentation format:

/endpoint_group/

  • Description of general purpose

/endpoint_group/endpoint_name

  • Purpose

  • Description of how backend uses it

  • Description of how frontend uses it

/mav/

  • Endpoints for communicating with the Pixhawk

/mav/quick

  • Current telemetry in a dictionary

  • Backend requests telemetry from the Pixhawk

  • Frontend GETs the data and displays it on the Quick subtab in the Flight Data tab

/mav/all

  • All Pixhawk-related data in a dictionary

  • Backend requests the data from the Pixhawk

  • Frontend GETs the data and displays it on the All subtab in the Flight Data tab

/mav/actions

  • Dictionary of actions to be uploaded to the Pixhawk via pymavlink

  • Backend receives the actions and sends them to the Pixhawk

  • Frontend POSTs the data based on the buttons the user clicks
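
As a usage sketch, the frontend (or any HTTP client) can exercise the /mav/ endpoints as shown below. The base URL and the action payload are assumptions, not the actual schema used by the ground station.

```python
import requests

BASE = "http://localhost:5000"  # assumed backend address

# Poll current telemetry for the Quick subtab.
quick = requests.get(f"{BASE}/mav/quick").json()
print(quick)

# POST an action dictionary for the backend to forward to the Pixhawk.
# The key/value here are illustrative; the real action format is defined by the backend.
requests.post(f"{BASE}/mav/actions", json={"command": "RTL"})
```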

/interop/

  • The backend server is the only direct point of contact between the interop server and the rest of the ground station, so all interop-related actions must go through the following endpoints.

/interop/login/

  • Logging into interop

  • Sending a GET to this endpoint forces a retry of the interop login if the backend is not already logged in. It also returns the login status.

/interop/get/<key>

  • Returns the data requested via the <key> variable

    • Current list of requestable data: mission ID, waypoints list, obstacles data, teams list, search grid boundary, UGV drive / drop locations, off-axis / emergent ODLC locations, and the RTL location for lost comms

  • Backend POSTs the requested data to the endpoint

  • Frontend GETs the data and uses it wherever it is needed
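
For example, a client might fetch one of the requestable items by key, as sketched below. The exact key string ("waypoints") and the base URL are assumptions.

```python
import requests

BASE = "http://localhost:5000"  # assumed backend address

# Fetch the waypoints list that the backend cached from interop.
waypoints = requests.get(f"{BASE}/interop/get/waypoints").json()
print(waypoints)
```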

/interop/telemetry/

  • Live telemetry data of the plane that is being submitted to interop

  • Backend POSTs live telemetry data every time it submits to interop

  • Anyone on the frontend side can open the endpoint in a browser and monitor the live updates

/interop/odlcs/<id>/<dtype>

  • Accessing ODLC data

  • Backend will query interop for an ODLC submission with the provided id, then return it

  • Frontend GETs the endpoint with the id and dtype parameters and displays the result on the client's screen
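
On the backend side, a route with two URL parameters like this one can be declared with Flask's URL converters. The sketch below is hypothetical; the helper function stands in for the real interop query performed by the backend.

```python
from flask import Flask, jsonify

app = Flask(__name__)

def query_interop_odlc(odlc_id, dtype):
    # Hypothetical stand-in for the backend's real interop client call.
    return {"id": odlc_id, "type": dtype}

# Corresponds to the documented /interop/odlcs/<id>/<dtype> endpoint.
@app.route("/interop/odlcs/<int:odlc_id>/<dtype>", methods=["GET"])
def get_odlc(odlc_id, dtype):
    # Query interop for the requested ODLC submission and return it as JSON.
    return jsonify(query_interop_odlc(odlc_id, dtype))
```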

/imaging/

  • Endpoints for imaging data from the Jetson: the raw image stream, ODLC detections, and the stitched map

/imaging/image_stream/<num>

  • Returns an image, represented as bytes in .png encoding

    • The <num> variable is the image number in the stream of images coming from the Jetson

/imaging/image_stream/boundaries

  • Returns the highest image number that has been received from the Jetson so far

/imaging/odcl_detection

  • Returns a list of dicts, where each dict holds two entries: an image and its characterization

    • The image is represented as bytes in .png encoding

    • The interop data of an image includes its thing, thing, and thing

/imaging/map_individual/

  • Returns all the individual images used for the map stitching algorithms

  • Not used by the frontend by default, but could be useful if we want to view the individual images

/imaging/map_stitched

  • Returns the stitched map
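
To illustrate the imaging flow, a client could first ask the boundaries endpoint how many images have arrived, then pull and decode the newest frame. The use of Pillow and the assumption that the boundaries endpoint returns a plain-text integer are both illustrative.

```python
import io

import requests
from PIL import Image

BASE = "http://localhost:5000"  # assumed backend address

# Highest image number received from the Jetson so far (assumed plain-text integer).
highest = int(requests.get(f"{BASE}/imaging/image_stream/boundaries").text)

# Fetch that frame and decode the PNG bytes into an image object.
raw = requests.get(f"{BASE}/imaging/image_stream/{highest}").content
frame = Image.open(io.BytesIO(raw))
print(frame.size)
```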

