macOS
You can use your Intel or M1-based Mac computer as a fully supported development environment for Edge Impulse for Linux. This lets you sample raw data, build models, and deploy trained machine learning models directly from the Studio. If you have a MacBook, the webcam and microphone of your system are automatically detected and can be used to build models.
MacBook Pro

1. Connecting to your Mac

To connect your Mac to Edge Impulse:
  1. Install Node.js.
  2. Install Homebrew.
  3. Open a terminal window and install the dependencies:
$ brew install sox
$ brew install imagesnap
  4. Last, install the Edge Impulse CLI:
$ npm install edge-impulse-linux -g

2. Connecting to Edge Impulse

With the software installed, open a terminal window and run:
$ edge-impulse-linux
This will start a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.

3. Verifying that your device is connected

That's all! Your Mac is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Device connected to Edge Impulse.

Next steps: building a machine learning model

With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? Our Linux SDK lets you easily send data from any sensor and any programming language (with examples in Node.js, Python, Go and C++) into Edge Impulse.
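Whichever language you use, the SDK ultimately takes your raw sensor readings as one flattened list of numbers. As a sketch of that shaping step (Python standing in for any of the supported languages, with made-up accelerometer values), a window of 3-axis samples is typically interleaved per reading like this:

```python
# Flatten a window of 3-axis accelerometer samples into a single feature list.
# Multi-axis time-series data is interleaved per reading: ax0, ay0, az0, ax1, ...
# The values and window length below are made up for illustration.
samples = [
    (0.1, 9.8, 0.0),   # (x, y, z) in m/s^2
    (0.2, 9.7, 0.1),
    (0.1, 9.9, -0.1),
]

features = [value for reading in samples for value in reading]
print(features)  # [0.1, 9.8, 0.0, 0.2, 9.7, 0.1, 0.1, 9.9, -0.1]
```

The same flattened list is what you would pass to the SDK's classification call, regardless of which language binding you use.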

Deploying back to device

To run your impulse locally, just open a terminal and run:
$ edge-impulse-linux-runner
This will automatically compile your model with full hardware acceleration, download the model to your Mac, and then start classifying. Our Linux SDK has examples on how to integrate the model with your favourite programming language.
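To give a feel for what integration looks like, here is a minimal Python sketch of consuming a classification result. The dict shape below mirrors what the Python SDK's classify() call returns (a 'result' entry mapping labels to scores), but treat it as an assumption and check the SDK examples for your language; the labels and scores are invented for illustration:

```python
# Pick the most likely label from a classification result.
# The shape of `res` is an assumption modelled on the Linux SDK's
# classify() output; the labels and scores are made up.
res = {
    "result": {"classification": {"noise": 0.05, "wave": 0.92, "idle": 0.03}},
    "timing": {"dsp": 2, "classification": 5},
}

scores = res["result"]["classification"]
label = max(scores, key=scores.get)
print(label, scores[label])  # wave 0.92
```

In a real integration you would get `res` from the SDK's runner instead of hard-coding it, and act on the top label in your application logic.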

Image model?

If you have an image model, you can get a peek at what your device sees: make sure you're on the same network as your device, then find the 'Want to see a feed of the camera and live classification in your browser' message in the console. Open the URL in a browser and both the camera feed and the classification results are shown:
Live feed with classification results