Creating Your Own Development Environment

John Little | Duration: 32:46

Modern software development tools have allowed more rapid, collaborative application development, but IoT and embedded software developers haven't always been able to best use these tools. Much of this is due to industry-defined development environments requiring clunky, non-automatic interfaces and platform-specific dependencies before you can deploy to your target. In this talk, I'll walk you through concepts and tools I use to build my environments, including versioning, virtualization, compilation, testing, continuous integration and deployment. After this talk, you'll be able to set up a snappy, replicable, and collaborative development environment to free your team from a constrained environment and build an integrated environment that best suits your needs.

Score: 1 | 2 years ago | 1 reply

John, that was a very interesting presentation. I particularly enjoyed your high level makefile. I had never thought of using make to execute commands like those you suggested. I've only been using make to compile code, clean, etc., the traditional way for decades. The auto generated help is really cool! Obviously you like thinking out of the box and I can see the benefits to the approach you have created.

John LittleSpeaker
Score: 1 | 2 years ago | no reply

Thank you! It's all put together from other pieces of software I've seen from friends and colleagues. GNU Make is my favorite utility.

Score: 0 | 2 years ago | no reply

Great presentation. Certainly helpful.

Score: 0 | 2 years ago | no reply

Quite an interesting approach.

Score: 1 | 2 years ago | 1 reply

I found your top-level makefiles quite interesting. I'm more used to the "standard" style of makefile, and mine tended to use named lists of files, rather than using shell operations to get them automatically. I think I'll try your way on my next project; it seems cleaner. As an aside, I did find that Robert Mecklenburg's book, "Managing Projects with GNU Make," gave me useful ideas and advice.

John LittleSpeaker
Score: 1 | 2 years ago | no reply

Definitely. In my first exposures to C/C++, I kept thinking, "why can't my computer automatically find every header file in the folder? Why do I have to put source files here, and include files in a different folder?" Once your makefile automatically finds the right files, you can add new source and header files really easily.

You might have a little difficulty defining include paths - instead of just finding files, you want to find every folder that has a header file within it. Here's some makefile code of mine that works for that (this comment has been edited to have a more elegant set of shell commands).

Edit: I've worked on it a little bit, and I now prefer the following for finding folders:

HEADERS := $(shell find $(SRC_DIR) -type f -name '*.h')
HEADER_FOLDERS := $(shell dirname $(HEADERS) | sort --unique)
IFLAGS := $(patsubst %, -I%, $(HEADER_FOLDERS))
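To see what those three `$(shell ...)` calls actually produce, here's the same pipeline run directly in the shell against a throwaway source tree (the paths and file names below are made up for the demo, not from the talk):

```shell
# Sketch of the makefile pipeline above, run as plain shell commands.
SRC_DIR=/tmp/iflags-demo/src
mkdir -p "$SRC_DIR/drivers" "$SRC_DIR/util"
touch "$SRC_DIR/drivers/uart.h" "$SRC_DIR/util/ring.h" "$SRC_DIR/main.c"

# Step 1: every header file under the source tree (main.c is skipped)
HEADERS=$(find "$SRC_DIR" -type f -name '*.h')

# Step 2: the unique set of folders that contain at least one header
# ($HEADERS is left unquoted on purpose so each path becomes its own argument)
HEADER_FOLDERS=$(dirname $HEADERS | sort --unique)

# Step 3: turn each folder into a -I flag for the compiler
IFLAGS=$(for d in $HEADER_FOLDERS; do printf ' -I%s' "$d"; done)
echo "$IFLAGS"
```

The result is one `-I` flag per folder that holds a header, ready to drop into a compile rule.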
Score: 2 | 2 years ago | 1 reply

Why use make to organize the tools/aliases instead of something like python?

John LittleSpeaker
Score: 2 | 2 years ago | no reply

In short: for simple applications, Make has a very, very easy learning curve, easier than Python's. I even use Make in my Python-only projects because of how easy it is!
For automatic C compilation, Make has a steep learning curve, but Python can't even compete.

GNU Make is really convenient because it is built into most Linux installs and has a pretty simple syntax. All you need to do to use it is create a makefile and type make on your command line.

If you were to do something similar in Python, you'd need to build a script that imports the os package and takes a user-input alias to run a command-line script. There are several snags involved with this:

  1. Your command-line aliases become long and complicated anyway. make toast is simpler, I think, than python aliases.py toast. To run Python commands, you'd need to use the command line anyway, so why not use a tool that aliases these commands particularly well?
  2. If you're trying to collaborate or run on another computer, you are subject to the whims of the operating system's native Python version: whether it's 2.7, 3.9, or whether it's even installed at all!
  3. To get it all to run the same everywhere, you'd probably need to set up an environment with requirements.txt and open a virtual environment, and you'd need to do that on the command line!

Ultimately, Python is a general-purpose tool. You can use it to install packages and write scripts that do what you're asking in a more complicated way, but Make is a very lightweight, purpose-built tool for executing command-line instructions, and it can do so with two lines in a simple, readable file.
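For illustration, a two-line "alias" makefile of the kind described here might look like this (the toast target is the hypothetical example from the comment, and the echo is a stand-in for whatever command you want to alias):

```make
# A minimal "alias" makefile: each phony target just names a command to run.
.PHONY: toast
toast:
	echo "running the toaster"
```

With that file in place, typing make toast runs the aliased command, and adding another alias is just another two-line target.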

Score: 2 | 2 years ago | 1 reply

Which references do you recommend for learning the virtualization process? For example, I want to create an avr-gcc toolchain image for the ATmega8 series, or arm-gcc for the Raspberry Pi Pico; I love the CLIs. In addition, could you provide an email for contact?

John LittleSpeaker
Score: 1 | 2 years ago | no reply

As far as references go, I typically start by googling 'Dockerfile example' and browsing 5-10 examples to see how other people create them. StackOverflow, blog posts, coding websites, or YouTube channels if you prefer are all fine. I'd also definitely look at Docker's official instructions, and then I'd start making my own Dockerfile.

I'd start with a very simple Dockerfile: one that uses the FROM command to define your base image, and the WORKDIR command to set your working directory. Nothing else.

Then, I'd play around with the docker build command and its flags to turn the Dockerfile into an image, and the docker run command to start a container from that image and give you a command-line interface inside it.

Finally, I'd start installing all the tools I need while I'm in that image. Usually, this consists of me googling "install (tool name) command-line." You can install avr-gcc and avrdude, for example, with an apt-get command. I'd first run this command within the Docker container that I've set up, just to see if the command works or fails.

If that command works to install what I need, I'd set it in stone by moving it into a RUN command within the Dockerfile, then rebuilding the image and reopening the container.
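Putting those steps together, a minimal Dockerfile along these lines might look like the sketch below. The base image, package names, and working directory are assumptions for illustration, not the speaker's exact setup:

```dockerfile
# Base image (assumed choice; any distro with apt works similarly)
FROM debian:bookworm-slim

# AVR toolchain install, promoted to a RUN command once the apt-get
# line has been verified interactively inside a container
RUN apt-get update && \
    apt-get install -y --no-install-recommends gcc-avr avr-libc avrdude && \
    rm -rf /var/lib/apt/lists/*

# Working directory for the project
WORKDIR /project
```

You'd then rebuild with something like docker build -t avr-dev . and open a shell in it with docker run -it --rm avr-dev bash to try the next tool install.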