Calling sys.exit() directly is so, so rude.
I'm not testing every path right now, but this is vastly better already.
Signed-off-by: Karl Palsson <karlp@tweak.au>
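For illustration only (the function and exception names here are hypothetical, not taken from the actual script): instead of calling sys.exit() deep inside helper code, raise an exception and let the entry point turn it into an exit status.

    import sys


    class GeneratorError(Exception):
        """Raised by helpers instead of calling sys.exit() directly."""


    def load_irq_map(path):
        # Hypothetical helper: report problems by raising, not by exiting,
        # so callers (and tests) can decide what to do with the failure.
        try:
            with open(path) as f:
                return f.read()
        except OSError as e:
            raise GeneratorError("cannot read %s: %s" % (path, e))


    def main(argv):
        if len(argv) != 2:
            print("usage: %s <irq-map>" % argv[0], file=sys.stderr)
            return 2
        try:
            load_irq_map(argv[1])
        except GeneratorError as e:
            print(e, file=sys.stderr)
            return 1
        return 0


    if __name__ == "__main__":
        sys.exit(main(sys.argv))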
Python 2 has been end-of-life [1] since the 1st of January 2020.
Some distributions (most notably: Debian and its derivatives) will stop
providing a `python` executable in order to encourage users to specify
the interpreter language of local scripts explicitly. Users of such
environments will be forced to work around this in one of these ways:
* create a virtual environment or
* manipulate the shebangs of the scripts or
* install the python2 package (as long as it is provided by distributions)
All currently maintained distribution releases provide python3.
In the near future distributions will need to remove python2, since it
is not maintained anymore.
PEP-394 [2] recommends referencing a specific python version
(python2 or python3) if the script is not expected to run in a virtual
environment.
Closes: #1265
[1] https://www.python.org/dev/peps/pep-0373/#update-april-2014
[2] https://www.python.org/dev/peps/pep-0394/#for-python-script-publishers
Amended-by: Karl Palsson <karlp@tweak.net.au>
* moved lpc43xx scripts to explicitly call python2; they have not been
  ported and are effectively unmaintained, but switching them to
  python3 unconditionally would be unhelpful.
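As a minimal sketch of the PEP-394 recommendation above (the script body is made up; only the shebang lines reflect the change):

    #!/usr/bin/env python3
    # Explicit interpreter version, per PEP-394. The unported lpc43xx
    # scripts keep "#!/usr/bin/env python2" instead of being switched
    # unconditionally.
    import sys

    print("running under python %d.%d" % sys.version_info[:2])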
The nvic_ functions all had a broken link to an f1 list of irqs. Change
the header generator to generate a fixed name, and link to them.
Because of their scoping, this is ok: they find the correct family's irq
definitions.
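A rough sketch of the idea (the group name and wording below are hypothetical, not copied from irq2nvic_h): the generator emits the same fixed doxygen reference into every family's header, and doxygen's scoping resolves it against that family's own irq list.

    # Hypothetical sketch: every family's header gets the same fixed @ref
    # target, so there is no longer a hard-coded link to the f1 list.
    DOC_COMMENT = """\
    /** @brief User interrupt service routines.
     * See @ref CM3_nvic_defines_irqs for this family's irq definitions.
     */
    """

    def emit_doc_comment():
        # Fixed name for every family; doxygen resolves it per family.
        return DOC_COMMENT

    if __name__ == "__main__":
        print(emit_doc_comment(), end="")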
Generate the doc root doxygenlayout file, as well as the devices stuff, based on the main
Makefile $(TARGETS) and template files. Avoids a painful sync/merge of 20+ files
when adding a new device.
Bonus: allows building the docs for only one device easily (make TARGETS=stm32f0 doc)
Regression: we currently lose the device "fancy" naming, as the device name is
guessed (toUpper()..) from the folder name.
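A sketch of the approach under stated assumptions (the template string and target list are made up for illustration; the real work is driven from the Makefile): fill one shared template per entry in $(TARGETS) instead of hand-syncing one file per device.

    # Illustration only: fill a shared template once per target instead of
    # hand-syncing 20+ layout files. Names and template text are made up.
    TARGETS = ["stm32f0", "stm32f1", "lm3s"]

    TAB_TEMPLATE = "<tab type='user' url='@ref {target}' title='{name}'/>"

    for target in TARGETS:
        # The "fancy" name is currently just guessed from the folder name,
        # which is the regression noted above.
        name = target.upper()
        print(TAB_TEMPLATE.format(target=target, name=name))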
Much as we had to pull in the individual target's include files
manually, because they're useful without necessarily having .c files
using them, the cortex core headers also need to be included. This also
pulls in the doc-cm3.h file that sets up the nice groupings.
... since Clang doesn't infer the function type on '#pragma weak x = y'-style
declarations, and instead leaves it as "<overloaded function type>", thus
leading to a type conflict when assigning the ISRs to the interrupt vector.
This has no impact on normal use, but it makes the code more compatible, and
that's always a good thing.
Before (generated vector_nvic.c):
...
#pragma weak usart1_isr = blocking_handler
...
After:
...
void usart1_isr(void) __attribute__((weak, alias("blocking_handler")));
...
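A sketch of what the change amounts to in the generator (variable and template names are illustrative, not the actual irq2nvic_h code):

    # Illustrative only: emit the attribute form, which Clang type-checks
    # properly, instead of the '#pragma weak x = y' form.
    OLD_WEAK_TEMPLATE = "#pragma weak {isr} = blocking_handler\n"
    NEW_WEAK_TEMPLATE = (
        'void {isr}(void) __attribute__((weak, alias("blocking_handler")));\n'
    )

    def emit_weak_aliases(isrs, template=NEW_WEAK_TEMPLATE):
        return "".join(template.format(isr=isr) for isr in isrs)

    if __name__ == "__main__":
        print(emit_weak_aliases(["usart1_isr", "exti0_isr"]), end="")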
Until https://github.com/libopencm3/libopencm3/issues/732 has been
fixed, it's not enough to just have it in the README that you need GNU
awk. Explicitly use the "gawk" command name. This exists on (sane)
systems that have gawk as awk, and for systems that use mawk as default,
the gawk name should also exist.
This should make it significantly easier to diagnose the cause of build
problems.
Even if they haven't been referenced from a .c file. Some peripherals
start their life as register definitions only, and they should still
have the documentation generated.
There _will_ be overlap in the generated lists, but doxygen doesn't seem
to mind this.
Instead of the fragile and error-prone attempts to specifically
include/exclude files from doxygen by name and pattern, simply use the
already generated .d files to provide accurate and up-to-date lists of
all source files used.
Pros:
* Nothing left to worry about
* Much more encouraging to actually _work_ on the documentation now that
you can be sure the right docs will be generated instead of a confusing
mix.
Downsides/Upsides:
* Automatically includes all CM3/USB in each device's page _as well_ now
Downsides:
* lpc43xx still manually listed. However, it is completely contained in its
  own dir, so no problems
* No attempt to carry this over to latex. Easy enough, but I'm more tempted
  to drop latex support outright. (I don't think the generation there has
  even worked for a while now)
* Due to the mismatch between lib directories and document roots, the
  sourcelist can't be magically created per directory. There has to be
  some sort of mapping between the two, so as this is doc generation only,
  a static list seems sane for maintenance. (Especially compared to the
  old method)
* Source list generation probably doesn't work on windows.
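For illustration, a minimal sketch of harvesting the source list from the compiler-generated .d files (the actual build does this with make/shell; the glob path and output file name here are assumptions):

    # Illustration: collect a de-duplicated source/header list for doxygen
    # from the dependency (.d) files the build already generates.
    import glob
    import os

    sources = set()
    for dep in glob.glob("lib/stm32/f4/*.d"):     # assumed .d location
        with open(dep) as f:
            text = f.read().replace("\\\n", " ")  # join continuation lines
        for token in text.split():
            if token.endswith(":"):               # skip the "target.o:" part
                continue
            if token.endswith((".c", ".h")):
                sources.add(os.path.normpath(token))

    # One list per doc root; doxygen doesn't mind the overlap between them.
    with open("sourcelist.txt", "w") as out:
        out.write("\n".join(sorted(sources)) + "\n")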
As discussed with karlp on irc, the devices.data file should not contain
gcc-specific command line options.
For that reason the command line options for gcc are now generated from
the variables CPU and FPU by the rules in the mk directory.
This breaks the genlink tests.
genlink: simplified devices.data
devices.data already had the information about the family name. By using
the first field (the pattern used to match the device) as the family name,
that data doesn't have to be provided explicitly. The same data is used to
generate the CPPFLAGS, such as -DSTM32F1.
The architectures block of the devices.data file was redundant.
genlink-config.mk uses family and subfamily to figure out which libopencm3
variant actually exists.
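Roughly the mapping involved, shown as a Python illustration (the real rules live in the mk directory and are written in make, not python; the FPU value spellings below are assumptions):

    # Illustration only: how the family pattern and the CPU/FPU values from
    # devices.data could map to compiler flags.
    def cppflags(family):
        return ["-D" + family.upper()]            # e.g. stm32f1 -> -DSTM32F1

    def arch_flags(cpu, fpu):
        flags = ["-mcpu=" + cpu, "-mthumb"]
        if fpu == "soft":
            flags.append("-mfloat-abi=soft")
        else:
            # assumed spelling, e.g. "hard-fpv4-sp-d16"
            abi, _, unit = fpu.partition("-")
            flags += ["-mfloat-abi=" + abi, "-mfpu=" + unit]
        return flags

    print(cppflags("stm32f1"), arch_flags("cortex-m3", "soft"))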
These prototypes affect functions defined by application code. Only
the implementations in libopencm3 are supposed to be weak; the
functions in application code should definitely not be. Otherwise,
you'll end up with two weak symbols being linked together, and
it's luck as to which one the linker picks.
ARCH holds all the -m flags (to be expanded into ARCH_FLAGS in the Makefile).
DEFS holds all the -D flags (to be expanded into DEFS in the Makefile).
LIB holds all the -l flags (to be expanded into LIBNAME in the Makefile).
If no MODE option is specified, the generator behaves as in the previous version.
This makes it possible for the script to append the definitions to CFLAGS
and LDFLAGS, and with the ability to disable prepending -D it becomes
possible to generate the ARCH_FLAGS appropriate to each specific chip.
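A sketch of the MODE idea (the real generator is not this python code; the data values are made up and only mirror the description above):

    # Illustration of the MODE switch: one source of data, different output
    # depending on what the Makefile asks for.
    DATA = {
        "ARCH": ["-mcpu=cortex-m0", "-mthumb", "-mfloat-abi=soft"],  # ARCH_FLAGS
        "DEFS": ["-DSTM32F0"],                                       # DEFS
        "LIB":  ["-lopencm3_stm32f0"],                               # LIBNAME
    }

    def generate(mode=None):
        if mode is None:
            # No MODE given: behave as before and emit everything.
            return " ".join(flag for group in DATA.values() for flag in group)
        return " ".join(DATA[mode])

    if __name__ == "__main__":
        print(generate("ARCH"))
        print(generate("DEFS"))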
This converts all the YAML files to JSON files, as json parsing is built
into python instead of being a separate library requiring installation.
YAML is a superset of JSON, but putting comments into JSON is not quite as
obvious as it is in YAML.
The following glue was used to convert yaml to json:
python -c 'import sys, yaml, json; json.dump(yaml.load(sys.stdin), sys.stdout, indent=4)' < $1 > $2
Clearly I haven't tested this on every single platform, and this
doesn't address the large blobs of yaml in the lpc4300 scripts directory,
only the cortex NVIC generation process.
I've tested a few IRQ-driven example apps, and I've checked the generated
output of some known cases like the LM3S, which has explicit gaps, and they
are all generated correctly.
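On the consuming side the change is small, since the json module ships with python; a minimal, self-contained sketch (the data shown is made up, not an actual interrupt description):

    # Minimal sketch: parse an interrupt description with the stdlib json
    # module instead of the third-party yaml one.
    import json

    text = '{"irqs": ["wwdg", "pvd", "tamper"]}'
    data = json.loads(text)
    print(data["irqs"])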