
Merge pull request #929 from svaarala/prepare-sources-cleanups

Cleanups for dist and prepare-and-config split
Sami Vaarala (committed via GitHub)
Commit 13f4539b2a
Changed files (40), changed line count in parentheses:

 1. Makefile (6)
 2. RELEASES.rst (6)
 3. doc/bytecode.rst (2)
 4. doc/code-issues.rst (2)
 5. doc/low-memory.rst (15)
 6. doc/release-checklist.rst (2)
 7. doc/release-notes-v2-0.rst (37)
 8. examples/cmdline/duk_cmdline_ajduk.c (6)
 9. misc/CaseConversion.java (0, renamed from src/CaseConversion.java)
10. src/SpecialCasing-8bit.txt (1)
11. src/UnicodeData-8bit.txt (256)
12. src/duk_bi_date.c (2)
13. src/duk_heap_hashstring.c (2)
14. src/duk_numconv.c (2)
15. src/duk_unicode_support.c (14)
16. src/duk_unicode_tables.c (4)
17. testrunner/client-simple-node/run_commit_test.py (45)
18. tools/combine_src.py (11)
19. tools/extract_caseconv.py (26)
20. tools/extract_chars.py (13)
21. tools/genbuildparams.py (44)
22. tools/genbuiltins.py (32)
23. tools/genconfig.py (63)
24. tools/prepare_sources.py (112)
25. util/bluebird-test-shim.js (2)
26. util/check_align.c (2)
27. util/check_code_policy.py (2)
28. util/example_rombuild.sh (2)
29. util/example_user_builtins1.yaml (31)
30. util/fastint_reps.py (2)
31. util/format_perftest.py (1)
32. util/gendoubleconsts.py (7)
33. util/genhashsizes.py (1)
34. util/gennumdigits.py (1)
35. util/make_dist.py (492)
36. util/prep_test.py (2)
37. util/remove_bufferobject_properties.yaml (64)
38. util/remove_math_properties.yaml (8)
39. util/underscore-test-shim.js (10)
40. util/underscore_test.sh (1)

Makefile (6)

@@ -342,7 +342,7 @@ clean:
 @rm -f duktape-*.tar.*
 @rm -f duktape-*.iso
 @rm -f doc/*.html
-@rm -f src/*.pyc
+@rm -f src/*.pyc tools/*.pyc util/*.pyc
 @rm -rf massif.out.* ms_print.tmp.*
 @rm -rf cachegrind.out.*
 @rm -rf callgrind.out.*
@@ -488,7 +488,7 @@ endif
 .PHONY: duksizes
 duksizes: duk.raw
-$(PYTHON) src/genexesizereport.py $< > /tmp/duk_sizes.html
+$(PYTHON) tools/genexesizereport.py $< > /tmp/duk_sizes.html
 .PHONY: issuecount
 issuecount:
@@ -1154,7 +1154,7 @@ codepolicycheck:
 --check-mixed-indent \
 --check-tab-indent \
 --dump-vim-commands \
-src/*.py tools/*.py util/*.py debugger/*/*.py examples/*/*.py
+src/*.py tools/*.py util/*.py debugger/*/*.py examples/*/*.py testrunner/*/*.py
 @$(PYTHON) util/check_code_policy.py \
 $(CODEPOLICYOPTS) \
 --check-debug-log-calls \

RELEASES.rst (6)

@@ -1715,6 +1715,12 @@ Planned
   and constructor call argument count from 511 to 255, and maximum Ecmascript
   function constant count from 262144 to 65536 (GH-903)
+* Incompatible change: genconfig.py has been relocated to tools/genconfig.py
+  in the end user distributable (GH-929)
+* Incompatible change: make_dist.py no longer supports ROM built-ins, use
+  tools/prepare_sources.py instead (GH-929)
 * Include raw input sources and a prepare-and-config tool in the distributable,
   which allow user code to regenerate a config file and source code files for
   specified options; this is more comprehensive than just running genconfig.py
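
For illustration (not part of this PR's diff), the genconfig.py relocation only changes the path existing users invoke in the distributable; the option names and the duk-config-header command come from the genconfig.py hunk later in this diff, while the metadata and output paths are assumptions:

    # Illustrative sketch: standalone config header generation with the
    # relocated tool (config/genconfig.py -> tools/genconfig.py).
    # The metadata archive and output paths below are assumptions.
    import subprocess

    subprocess.check_call([
        'python2', 'tools/genconfig.py',
        '--metadata', 'config/genconfig_metadata.tar.gz',
        '--output', 'duk_config.h',
        'duk-config-header',
    ])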

doc/bytecode.rst (2)

@@ -334,7 +334,7 @@ The exact format is ultimately defined by the source code, see:
 * ``src/duk_api_bytecode.c``
-* ``util/dump_bytecode.py``
+* ``tools/dump_bytecode.py``
 As a simplified summary of the bytecode format:

doc/code-issues.rst (2)

@@ -811,7 +811,7 @@ The ``DUK_INTERNAL_DECL`` idiom is::
     DUK_INTERNAL_DECL const char *duk_str_not_object;
 #endif  /* !DUK_SINGLE_FILE */
-For this to work in the single file case, ``util/combine_src.py`` must
+For this to work in the single file case, ``tools/combine_src.py`` must
 ensure that the symbol definition appears before its use. This is currently
 handled via manual file reordering.

doc/low-memory.rst (15)

@@ -158,8 +158,8 @@ Miscellaneous
   stub out unnecessary functions in ``duk_config.h``. Note, however, that
   Duktape internals at present depend on a few Math functions like ``DUK_FMOD()``.
-Suggested feature options
-=========================
+Suggested options
+=================
 * Use the default memory management settings: although reference counting
   increases heap header size, it also reduces memory usage fluctuation
@@ -232,6 +232,17 @@ Suggested feature options
   - ``-DDUK_OPT_DEBUG_BUFSIZE=2048``
+* If strict Unicode support is not critical in your application, you can:
+
+  - Strip the ``UnicodeData.txt`` and ``SpecialCasing.txt`` files manually.
+    There are example files in the distributable for Unicode data limited
+    to 8-bit codepoints.
+
+  - Provide the stripped files to ``prepare_sources.py`` to reduce Unicode
+    table size.
+
+  - Possible footprint savings are about 2-3kB.
 More aggressive options
 =======================
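
For illustration (not part of this PR's diff), the stripped Unicode files added above would be fed to prepare_sources.py roughly as follows; the --unicode-data and --special-casing options appear in this PR's run_commit_test.py and prepare_sources.py hunks, while the dist/ directory layout is an assumption based on the distributable:

    # Illustrative sketch: low-memory source preparation with stripped
    # Unicode data. Option names come from this PR; paths are assumptions.
    import subprocess

    subprocess.check_call([
        'python2', 'dist/tools/prepare_sources.py',
        '--source-directory', 'dist/src-input',
        '--output-directory', 'dist',
        '--config-metadata', 'dist/config/genconfig_metadata.tar.gz',
        '--option-file', 'config/examples/low_memory.yaml',
        '--unicode-data', 'dist/src-input/UnicodeData-8bit.txt',
        '--special-casing', 'dist/src-input/SpecialCasing-8bit.txt',
    ])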

doc/release-checklist.rst (2)

@@ -41,7 +41,7 @@ Checklist for ordinary releases
   - Check year range
-  - Also check ``util/create_spdx_license.py``
+  - Also check ``tools/create_spdx_license.py``
 * Ensure RELEASES.rst is up-to-date (must be done before candidate tar.xz
   build because dist package contains RELEASES.rst)

doc/release-notes-v2-0.rst (37)

@@ -67,6 +67,43 @@ DUK_OPT_xxx feature option support removed
 FIXME.
+Tooling changes
+---------------
+
+There are some tooling changes in this release:
+
+* The distributable now includes raw sources (``src/`` in the Duktape main
+  repo) in ``src-input/`` and some tooling in ``tools/``.
+
+* The tooling includes a new ``tools/prepare_sources.py`` tool which creates
+  a ``duk_config.h`` and matching prepared sources simultaneously. This
+  allows use of ROM built-ins from the distributable (previously this
+  required a manual ``make_dist.py --rom-support ...`` command).
+
+* The ``make_dist.py`` utility in the Duktape main repo no longer supports
+  the ``--rom-support``, ``--rom-auto-lightfunc``, and
+  ``--user-builtin-metadata`` options. Use ``tools/prepare_sources.py``
+  instead, which supports these options.
+
+* The distributable still includes sources prepared using the default
+  configuration (``src/``, ``src-noline/``, and ``src-separate/``) and some
+  configuration examples.
+
+* ``config/genconfig.py`` has been relocated to ``tools/genconfig.py`` in
+  the distributable. It can still be used as a standalone tool, but over
+  time the intent is that configuration and sources are prepared in one
+  atomic step.
+
+To upgrade:
+
+* If you're just using the default sources and ``duk_config.h`` in the
+  distributable, no changes are needed.
+
+* If you're using ``genconfig.py``, check the path; the correct path is now
+  ``tools/genconfig.py``.
+
+* If you're using ROM built-ins via ``make_dist.py``, change your build to
+  use ``tools/prepare_sources.py`` instead.
+
 Buffer behavior changes
 -----------------------
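
For illustration (not part of this PR's diff), a ROM built-ins build previously driven by make_dist.py would move to prepare_sources.py roughly as sketched below; --rom-support, --rom-auto-lightfunc and --user-builtin-metadata are the options this PR moves over, while the paths and the user metadata file name are assumptions:

    # Illustrative sketch: ROM built-ins via the new tools/prepare_sources.py.
    # my_builtins.yaml is a hypothetical user metadata file; paths are assumptions.
    import subprocess

    subprocess.check_call([
        'python2', 'dist/tools/prepare_sources.py',
        '--source-directory', 'dist/src-input',
        '--output-directory', 'dist',
        '--config-metadata', 'dist/config/genconfig_metadata.tar.gz',
        '--rom-support',
        '--rom-auto-lightfunc',
        '--user-builtin-metadata', 'my_builtins.yaml',
    ])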

examples/cmdline/duk_cmdline_ajduk.c (6)

@@ -508,14 +508,14 @@ void ajsheap_extstr_free_1(const void *ptr) {
  * binary.
  *
  * Duktape built-in strings are available from duk_build_meta.json, see
- * util/duk_meta_to_strarray.py. There may also be a lot of application
+ * tools/duk_meta_to_strarray.py. There may also be a lot of application
  * specific strings, e.g. those used by application specific APIs. These
- * must be gathered through some other means, see e.g. util/scan_strings.py.
+ * must be gathered through some other means, see e.g. tools/scan_strings.py.
  */
 static const char *strdata_duk_builtin_strings[] = {
 	/*
-	 * These strings are from util/duk_meta_to_strarray.py
+	 * These strings are from tools/duk_meta_to_strarray.py
 	 */
 	"Logger",

src/CaseConversion.java → misc/CaseConversion.java (renamed, 0 lines changed)

src/SpecialCasing-8bit.txt (new file, 1 line)

@@ -0,0 +1 @@
+00DF; 00DF; 0053 0073; 0053 0053; # LATIN SMALL LETTER SHARP S

src/UnicodeData-8bit.txt (new file, 256 lines)

@@ -0,0 +1,256 @@
0000;<control>;Cc;0;BN;;;;;N;NULL;;;;
0001;<control>;Cc;0;BN;;;;;N;START OF HEADING;;;;
0002;<control>;Cc;0;BN;;;;;N;START OF TEXT;;;;
0003;<control>;Cc;0;BN;;;;;N;END OF TEXT;;;;
0004;<control>;Cc;0;BN;;;;;N;END OF TRANSMISSION;;;;
0005;<control>;Cc;0;BN;;;;;N;ENQUIRY;;;;
0006;<control>;Cc;0;BN;;;;;N;ACKNOWLEDGE;;;;
0007;<control>;Cc;0;BN;;;;;N;BELL;;;;
0008;<control>;Cc;0;BN;;;;;N;BACKSPACE;;;;
0009;<control>;Cc;0;S;;;;;N;CHARACTER TABULATION;;;;
000A;<control>;Cc;0;B;;;;;N;LINE FEED (LF);;;;
000B;<control>;Cc;0;S;;;;;N;LINE TABULATION;;;;
000C;<control>;Cc;0;WS;;;;;N;FORM FEED (FF);;;;
000D;<control>;Cc;0;B;;;;;N;CARRIAGE RETURN (CR);;;;
000E;<control>;Cc;0;BN;;;;;N;SHIFT OUT;;;;
000F;<control>;Cc;0;BN;;;;;N;SHIFT IN;;;;
0010;<control>;Cc;0;BN;;;;;N;DATA LINK ESCAPE;;;;
0011;<control>;Cc;0;BN;;;;;N;DEVICE CONTROL ONE;;;;
0012;<control>;Cc;0;BN;;;;;N;DEVICE CONTROL TWO;;;;
0013;<control>;Cc;0;BN;;;;;N;DEVICE CONTROL THREE;;;;
0014;<control>;Cc;0;BN;;;;;N;DEVICE CONTROL FOUR;;;;
0015;<control>;Cc;0;BN;;;;;N;NEGATIVE ACKNOWLEDGE;;;;
0016;<control>;Cc;0;BN;;;;;N;SYNCHRONOUS IDLE;;;;
0017;<control>;Cc;0;BN;;;;;N;END OF TRANSMISSION BLOCK;;;;
0018;<control>;Cc;0;BN;;;;;N;CANCEL;;;;
0019;<control>;Cc;0;BN;;;;;N;END OF MEDIUM;;;;
001A;<control>;Cc;0;BN;;;;;N;SUBSTITUTE;;;;
001B;<control>;Cc;0;BN;;;;;N;ESCAPE;;;;
001C;<control>;Cc;0;B;;;;;N;INFORMATION SEPARATOR FOUR;;;;
001D;<control>;Cc;0;B;;;;;N;INFORMATION SEPARATOR THREE;;;;
001E;<control>;Cc;0;B;;;;;N;INFORMATION SEPARATOR TWO;;;;
001F;<control>;Cc;0;S;;;;;N;INFORMATION SEPARATOR ONE;;;;
0020;SPACE;Zs;0;WS;;;;;N;;;;;
0021;EXCLAMATION MARK;Po;0;ON;;;;;N;;;;;
0022;QUOTATION MARK;Po;0;ON;;;;;N;;;;;
0023;NUMBER SIGN;Po;0;ET;;;;;N;;;;;
0024;DOLLAR SIGN;Sc;0;ET;;;;;N;;;;;
0025;PERCENT SIGN;Po;0;ET;;;;;N;;;;;
0026;AMPERSAND;Po;0;ON;;;;;N;;;;;
0027;APOSTROPHE;Po;0;ON;;;;;N;APOSTROPHE-QUOTE;;;;
0028;LEFT PARENTHESIS;Ps;0;ON;;;;;Y;OPENING PARENTHESIS;;;;
0029;RIGHT PARENTHESIS;Pe;0;ON;;;;;Y;CLOSING PARENTHESIS;;;;
002A;ASTERISK;Po;0;ON;;;;;N;;;;;
002B;PLUS SIGN;Sm;0;ES;;;;;N;;;;;
002C;COMMA;Po;0;CS;;;;;N;;;;;
002D;HYPHEN-MINUS;Pd;0;ES;;;;;N;;;;;
002E;FULL STOP;Po;0;CS;;;;;N;PERIOD;;;;
002F;SOLIDUS;Po;0;CS;;;;;N;SLASH;;;;
0030;DIGIT ZERO;Nd;0;EN;;0;0;0;N;;;;;
0031;DIGIT ONE;Nd;0;EN;;1;1;1;N;;;;;
0032;DIGIT TWO;Nd;0;EN;;2;2;2;N;;;;;
0033;DIGIT THREE;Nd;0;EN;;3;3;3;N;;;;;
0034;DIGIT FOUR;Nd;0;EN;;4;4;4;N;;;;;
0035;DIGIT FIVE;Nd;0;EN;;5;5;5;N;;;;;
0036;DIGIT SIX;Nd;0;EN;;6;6;6;N;;;;;
0037;DIGIT SEVEN;Nd;0;EN;;7;7;7;N;;;;;
0038;DIGIT EIGHT;Nd;0;EN;;8;8;8;N;;;;;
0039;DIGIT NINE;Nd;0;EN;;9;9;9;N;;;;;
003A;COLON;Po;0;CS;;;;;N;;;;;
003B;SEMICOLON;Po;0;ON;;;;;N;;;;;
003C;LESS-THAN SIGN;Sm;0;ON;;;;;Y;;;;;
003D;EQUALS SIGN;Sm;0;ON;;;;;N;;;;;
003E;GREATER-THAN SIGN;Sm;0;ON;;;;;Y;;;;;
003F;QUESTION MARK;Po;0;ON;;;;;N;;;;;
0040;COMMERCIAL AT;Po;0;ON;;;;;N;;;;;
0041;LATIN CAPITAL LETTER A;Lu;0;L;;;;;N;;;;0061;
0042;LATIN CAPITAL LETTER B;Lu;0;L;;;;;N;;;;0062;
0043;LATIN CAPITAL LETTER C;Lu;0;L;;;;;N;;;;0063;
0044;LATIN CAPITAL LETTER D;Lu;0;L;;;;;N;;;;0064;
0045;LATIN CAPITAL LETTER E;Lu;0;L;;;;;N;;;;0065;
0046;LATIN CAPITAL LETTER F;Lu;0;L;;;;;N;;;;0066;
0047;LATIN CAPITAL LETTER G;Lu;0;L;;;;;N;;;;0067;
0048;LATIN CAPITAL LETTER H;Lu;0;L;;;;;N;;;;0068;
0049;LATIN CAPITAL LETTER I;Lu;0;L;;;;;N;;;;0069;
004A;LATIN CAPITAL LETTER J;Lu;0;L;;;;;N;;;;006A;
004B;LATIN CAPITAL LETTER K;Lu;0;L;;;;;N;;;;006B;
004C;LATIN CAPITAL LETTER L;Lu;0;L;;;;;N;;;;006C;
004D;LATIN CAPITAL LETTER M;Lu;0;L;;;;;N;;;;006D;
004E;LATIN CAPITAL LETTER N;Lu;0;L;;;;;N;;;;006E;
004F;LATIN CAPITAL LETTER O;Lu;0;L;;;;;N;;;;006F;
0050;LATIN CAPITAL LETTER P;Lu;0;L;;;;;N;;;;0070;
0051;LATIN CAPITAL LETTER Q;Lu;0;L;;;;;N;;;;0071;
0052;LATIN CAPITAL LETTER R;Lu;0;L;;;;;N;;;;0072;
0053;LATIN CAPITAL LETTER S;Lu;0;L;;;;;N;;;;0073;
0054;LATIN CAPITAL LETTER T;Lu;0;L;;;;;N;;;;0074;
0055;LATIN CAPITAL LETTER U;Lu;0;L;;;;;N;;;;0075;
0056;LATIN CAPITAL LETTER V;Lu;0;L;;;;;N;;;;0076;
0057;LATIN CAPITAL LETTER W;Lu;0;L;;;;;N;;;;0077;
0058;LATIN CAPITAL LETTER X;Lu;0;L;;;;;N;;;;0078;
0059;LATIN CAPITAL LETTER Y;Lu;0;L;;;;;N;;;;0079;
005A;LATIN CAPITAL LETTER Z;Lu;0;L;;;;;N;;;;007A;
005B;LEFT SQUARE BRACKET;Ps;0;ON;;;;;Y;OPENING SQUARE BRACKET;;;;
005C;REVERSE SOLIDUS;Po;0;ON;;;;;N;BACKSLASH;;;;
005D;RIGHT SQUARE BRACKET;Pe;0;ON;;;;;Y;CLOSING SQUARE BRACKET;;;;
005E;CIRCUMFLEX ACCENT;Sk;0;ON;;;;;N;SPACING CIRCUMFLEX;;;;
005F;LOW LINE;Pc;0;ON;;;;;N;SPACING UNDERSCORE;;;;
0060;GRAVE ACCENT;Sk;0;ON;;;;;N;SPACING GRAVE;;;;
0061;LATIN SMALL LETTER A;Ll;0;L;;;;;N;;;0041;;0041
0062;LATIN SMALL LETTER B;Ll;0;L;;;;;N;;;0042;;0042
0063;LATIN SMALL LETTER C;Ll;0;L;;;;;N;;;0043;;0043
0064;LATIN SMALL LETTER D;Ll;0;L;;;;;N;;;0044;;0044
0065;LATIN SMALL LETTER E;Ll;0;L;;;;;N;;;0045;;0045
0066;LATIN SMALL LETTER F;Ll;0;L;;;;;N;;;0046;;0046
0067;LATIN SMALL LETTER G;Ll;0;L;;;;;N;;;0047;;0047
0068;LATIN SMALL LETTER H;Ll;0;L;;;;;N;;;0048;;0048
0069;LATIN SMALL LETTER I;Ll;0;L;;;;;N;;;0049;;0049
006A;LATIN SMALL LETTER J;Ll;0;L;;;;;N;;;004A;;004A
006B;LATIN SMALL LETTER K;Ll;0;L;;;;;N;;;004B;;004B
006C;LATIN SMALL LETTER L;Ll;0;L;;;;;N;;;004C;;004C
006D;LATIN SMALL LETTER M;Ll;0;L;;;;;N;;;004D;;004D
006E;LATIN SMALL LETTER N;Ll;0;L;;;;;N;;;004E;;004E
006F;LATIN SMALL LETTER O;Ll;0;L;;;;;N;;;004F;;004F
0070;LATIN SMALL LETTER P;Ll;0;L;;;;;N;;;0050;;0050
0071;LATIN SMALL LETTER Q;Ll;0;L;;;;;N;;;0051;;0051
0072;LATIN SMALL LETTER R;Ll;0;L;;;;;N;;;0052;;0052
0073;LATIN SMALL LETTER S;Ll;0;L;;;;;N;;;0053;;0053
0074;LATIN SMALL LETTER T;Ll;0;L;;;;;N;;;0054;;0054
0075;LATIN SMALL LETTER U;Ll;0;L;;;;;N;;;0055;;0055
0076;LATIN SMALL LETTER V;Ll;0;L;;;;;N;;;0056;;0056
0077;LATIN SMALL LETTER W;Ll;0;L;;;;;N;;;0057;;0057
0078;LATIN SMALL LETTER X;Ll;0;L;;;;;N;;;0058;;0058
0079;LATIN SMALL LETTER Y;Ll;0;L;;;;;N;;;0059;;0059
007A;LATIN SMALL LETTER Z;Ll;0;L;;;;;N;;;005A;;005A
007B;LEFT CURLY BRACKET;Ps;0;ON;;;;;Y;OPENING CURLY BRACKET;;;;
007C;VERTICAL LINE;Sm;0;ON;;;;;N;VERTICAL BAR;;;;
007D;RIGHT CURLY BRACKET;Pe;0;ON;;;;;Y;CLOSING CURLY BRACKET;;;;
007E;TILDE;Sm;0;ON;;;;;N;;;;;
007F;<control>;Cc;0;BN;;;;;N;DELETE;;;;
0080;<control>;Cc;0;BN;;;;;N;;;;;
0081;<control>;Cc;0;BN;;;;;N;;;;;
0082;<control>;Cc;0;BN;;;;;N;BREAK PERMITTED HERE;;;;
0083;<control>;Cc;0;BN;;;;;N;NO BREAK HERE;;;;
0084;<control>;Cc;0;BN;;;;;N;;;;;
0085;<control>;Cc;0;B;;;;;N;NEXT LINE (NEL);;;;
0086;<control>;Cc;0;BN;;;;;N;START OF SELECTED AREA;;;;
0087;<control>;Cc;0;BN;;;;;N;END OF SELECTED AREA;;;;
0088;<control>;Cc;0;BN;;;;;N;CHARACTER TABULATION SET;;;;
0089;<control>;Cc;0;BN;;;;;N;CHARACTER TABULATION WITH JUSTIFICATION;;;;
008A;<control>;Cc;0;BN;;;;;N;LINE TABULATION SET;;;;
008B;<control>;Cc;0;BN;;;;;N;PARTIAL LINE FORWARD;;;;
008C;<control>;Cc;0;BN;;;;;N;PARTIAL LINE BACKWARD;;;;
008D;<control>;Cc;0;BN;;;;;N;REVERSE LINE FEED;;;;
008E;<control>;Cc;0;BN;;;;;N;SINGLE SHIFT TWO;;;;
008F;<control>;Cc;0;BN;;;;;N;SINGLE SHIFT THREE;;;;
0090;<control>;Cc;0;BN;;;;;N;DEVICE CONTROL STRING;;;;
0091;<control>;Cc;0;BN;;;;;N;PRIVATE USE ONE;;;;
0092;<control>;Cc;0;BN;;;;;N;PRIVATE USE TWO;;;;
0093;<control>;Cc;0;BN;;;;;N;SET TRANSMIT STATE;;;;
0094;<control>;Cc;0;BN;;;;;N;CANCEL CHARACTER;;;;
0095;<control>;Cc;0;BN;;;;;N;MESSAGE WAITING;;;;
0096;<control>;Cc;0;BN;;;;;N;START OF GUARDED AREA;;;;
0097;<control>;Cc;0;BN;;;;;N;END OF GUARDED AREA;;;;
0098;<control>;Cc;0;BN;;;;;N;START OF STRING;;;;
0099;<control>;Cc;0;BN;;;;;N;;;;;
009A;<control>;Cc;0;BN;;;;;N;SINGLE CHARACTER INTRODUCER;;;;
009B;<control>;Cc;0;BN;;;;;N;CONTROL SEQUENCE INTRODUCER;;;;
009C;<control>;Cc;0;BN;;;;;N;STRING TERMINATOR;;;;
009D;<control>;Cc;0;BN;;;;;N;OPERATING SYSTEM COMMAND;;;;
009E;<control>;Cc;0;BN;;;;;N;PRIVACY MESSAGE;;;;
009F;<control>;Cc;0;BN;;;;;N;APPLICATION PROGRAM COMMAND;;;;
00A0;NO-BREAK SPACE;Zs;0;CS;<noBreak> 0020;;;;N;NON-BREAKING SPACE;;;;
00A1;INVERTED EXCLAMATION MARK;Po;0;ON;;;;;N;;;;;
00A2;CENT SIGN;Sc;0;ET;;;;;N;;;;;
00A3;POUND SIGN;Sc;0;ET;;;;;N;;;;;
00A4;CURRENCY SIGN;Sc;0;ET;;;;;N;;;;;
00A5;YEN SIGN;Sc;0;ET;;;;;N;;;;;
00A6;BROKEN BAR;So;0;ON;;;;;N;BROKEN VERTICAL BAR;;;;
00A7;SECTION SIGN;Po;0;ON;;;;;N;;;;;
00A8;DIAERESIS;Sk;0;ON;<compat> 0020 0308;;;;N;SPACING DIAERESIS;;;;
00A9;COPYRIGHT SIGN;So;0;ON;;;;;N;;;;;
00AA;FEMININE ORDINAL INDICATOR;Lo;0;L;<super> 0061;;;;N;;;;;
00AB;LEFT-POINTING DOUBLE ANGLE QUOTATION MARK;Pi;0;ON;;;;;Y;LEFT POINTING GUILLEMET;;;;
00AC;NOT SIGN;Sm;0;ON;;;;;N;;;;;
00AD;SOFT HYPHEN;Cf;0;BN;;;;;N;;;;;
00AE;REGISTERED SIGN;So;0;ON;;;;;N;REGISTERED TRADE MARK SIGN;;;;
00AF;MACRON;Sk;0;ON;<compat> 0020 0304;;;;N;SPACING MACRON;;;;
00B0;DEGREE SIGN;So;0;ET;;;;;N;;;;;
00B1;PLUS-MINUS SIGN;Sm;0;ET;;;;;N;PLUS-OR-MINUS SIGN;;;;
00B2;SUPERSCRIPT TWO;No;0;EN;<super> 0032;;2;2;N;SUPERSCRIPT DIGIT TWO;;;;
00B3;SUPERSCRIPT THREE;No;0;EN;<super> 0033;;3;3;N;SUPERSCRIPT DIGIT THREE;;;;
00B4;ACUTE ACCENT;Sk;0;ON;<compat> 0020 0301;;;;N;SPACING ACUTE;;;;
00B5;MICRO SIGN;Ll;0;L;<compat> 03BC;;;;N;;;039C;;039C
00B6;PILCROW SIGN;Po;0;ON;;;;;N;PARAGRAPH SIGN;;;;
00B7;MIDDLE DOT;Po;0;ON;;;;;N;;;;;
00B8;CEDILLA;Sk;0;ON;<compat> 0020 0327;;;;N;SPACING CEDILLA;;;;
00B9;SUPERSCRIPT ONE;No;0;EN;<super> 0031;;1;1;N;SUPERSCRIPT DIGIT ONE;;;;
00BA;MASCULINE ORDINAL INDICATOR;Lo;0;L;<super> 006F;;;;N;;;;;
00BB;RIGHT-POINTING DOUBLE ANGLE QUOTATION MARK;Pf;0;ON;;;;;Y;RIGHT POINTING GUILLEMET;;;;
00BC;VULGAR FRACTION ONE QUARTER;No;0;ON;<fraction> 0031 2044 0034;;;1/4;N;FRACTION ONE QUARTER;;;;
00BD;VULGAR FRACTION ONE HALF;No;0;ON;<fraction> 0031 2044 0032;;;1/2;N;FRACTION ONE HALF;;;;
00BE;VULGAR FRACTION THREE QUARTERS;No;0;ON;<fraction> 0033 2044 0034;;;3/4;N;FRACTION THREE QUARTERS;;;;
00BF;INVERTED QUESTION MARK;Po;0;ON;;;;;N;;;;;
00C0;LATIN CAPITAL LETTER A WITH GRAVE;Lu;0;L;0041 0300;;;;N;LATIN CAPITAL LETTER A GRAVE;;;00E0;
00C1;LATIN CAPITAL LETTER A WITH ACUTE;Lu;0;L;0041 0301;;;;N;LATIN CAPITAL LETTER A ACUTE;;;00E1;
00C2;LATIN CAPITAL LETTER A WITH CIRCUMFLEX;Lu;0;L;0041 0302;;;;N;LATIN CAPITAL LETTER A CIRCUMFLEX;;;00E2;
00C3;LATIN CAPITAL LETTER A WITH TILDE;Lu;0;L;0041 0303;;;;N;LATIN CAPITAL LETTER A TILDE;;;00E3;
00C4;LATIN CAPITAL LETTER A WITH DIAERESIS;Lu;0;L;0041 0308;;;;N;LATIN CAPITAL LETTER A DIAERESIS;;;00E4;
00C5;LATIN CAPITAL LETTER A WITH RING ABOVE;Lu;0;L;0041 030A;;;;N;LATIN CAPITAL LETTER A RING;;;00E5;
00C6;LATIN CAPITAL LETTER AE;Lu;0;L;;;;;N;LATIN CAPITAL LETTER A E;;;00E6;
00C7;LATIN CAPITAL LETTER C WITH CEDILLA;Lu;0;L;0043 0327;;;;N;LATIN CAPITAL LETTER C CEDILLA;;;00E7;
00C8;LATIN CAPITAL LETTER E WITH GRAVE;Lu;0;L;0045 0300;;;;N;LATIN CAPITAL LETTER E GRAVE;;;00E8;
00C9;LATIN CAPITAL LETTER E WITH ACUTE;Lu;0;L;0045 0301;;;;N;LATIN CAPITAL LETTER E ACUTE;;;00E9;
00CA;LATIN CAPITAL LETTER E WITH CIRCUMFLEX;Lu;0;L;0045 0302;;;;N;LATIN CAPITAL LETTER E CIRCUMFLEX;;;00EA;
00CB;LATIN CAPITAL LETTER E WITH DIAERESIS;Lu;0;L;0045 0308;;;;N;LATIN CAPITAL LETTER E DIAERESIS;;;00EB;
00CC;LATIN CAPITAL LETTER I WITH GRAVE;Lu;0;L;0049 0300;;;;N;LATIN CAPITAL LETTER I GRAVE;;;00EC;
00CD;LATIN CAPITAL LETTER I WITH ACUTE;Lu;0;L;0049 0301;;;;N;LATIN CAPITAL LETTER I ACUTE;;;00ED;
00CE;LATIN CAPITAL LETTER I WITH CIRCUMFLEX;Lu;0;L;0049 0302;;;;N;LATIN CAPITAL LETTER I CIRCUMFLEX;;;00EE;
00CF;LATIN CAPITAL LETTER I WITH DIAERESIS;Lu;0;L;0049 0308;;;;N;LATIN CAPITAL LETTER I DIAERESIS;;;00EF;
00D0;LATIN CAPITAL LETTER ETH;Lu;0;L;;;;;N;;;;00F0;
00D1;LATIN CAPITAL LETTER N WITH TILDE;Lu;0;L;004E 0303;;;;N;LATIN CAPITAL LETTER N TILDE;;;00F1;
00D2;LATIN CAPITAL LETTER O WITH GRAVE;Lu;0;L;004F 0300;;;;N;LATIN CAPITAL LETTER O GRAVE;;;00F2;
00D3;LATIN CAPITAL LETTER O WITH ACUTE;Lu;0;L;004F 0301;;;;N;LATIN CAPITAL LETTER O ACUTE;;;00F3;
00D4;LATIN CAPITAL LETTER O WITH CIRCUMFLEX;Lu;0;L;004F 0302;;;;N;LATIN CAPITAL LETTER O CIRCUMFLEX;;;00F4;
00D5;LATIN CAPITAL LETTER O WITH TILDE;Lu;0;L;004F 0303;;;;N;LATIN CAPITAL LETTER O TILDE;;;00F5;
00D6;LATIN CAPITAL LETTER O WITH DIAERESIS;Lu;0;L;004F 0308;;;;N;LATIN CAPITAL LETTER O DIAERESIS;;;00F6;
00D7;MULTIPLICATION SIGN;Sm;0;ON;;;;;N;;;;;
00D8;LATIN CAPITAL LETTER O WITH STROKE;Lu;0;L;;;;;N;LATIN CAPITAL LETTER O SLASH;;;00F8;
00D9;LATIN CAPITAL LETTER U WITH GRAVE;Lu;0;L;0055 0300;;;;N;LATIN CAPITAL LETTER U GRAVE;;;00F9;
00DA;LATIN CAPITAL LETTER U WITH ACUTE;Lu;0;L;0055 0301;;;;N;LATIN CAPITAL LETTER U ACUTE;;;00FA;
00DB;LATIN CAPITAL LETTER U WITH CIRCUMFLEX;Lu;0;L;0055 0302;;;;N;LATIN CAPITAL LETTER U CIRCUMFLEX;;;00FB;
00DC;LATIN CAPITAL LETTER U WITH DIAERESIS;Lu;0;L;0055 0308;;;;N;LATIN CAPITAL LETTER U DIAERESIS;;;00FC;
00DD;LATIN CAPITAL LETTER Y WITH ACUTE;Lu;0;L;0059 0301;;;;N;LATIN CAPITAL LETTER Y ACUTE;;;00FD;
00DE;LATIN CAPITAL LETTER THORN;Lu;0;L;;;;;N;;;;00FE;
00DF;LATIN SMALL LETTER SHARP S;Ll;0;L;;;;;N;;;;;
00E0;LATIN SMALL LETTER A WITH GRAVE;Ll;0;L;0061 0300;;;;N;LATIN SMALL LETTER A GRAVE;;00C0;;00C0
00E1;LATIN SMALL LETTER A WITH ACUTE;Ll;0;L;0061 0301;;;;N;LATIN SMALL LETTER A ACUTE;;00C1;;00C1
00E2;LATIN SMALL LETTER A WITH CIRCUMFLEX;Ll;0;L;0061 0302;;;;N;LATIN SMALL LETTER A CIRCUMFLEX;;00C2;;00C2
00E3;LATIN SMALL LETTER A WITH TILDE;Ll;0;L;0061 0303;;;;N;LATIN SMALL LETTER A TILDE;;00C3;;00C3
00E4;LATIN SMALL LETTER A WITH DIAERESIS;Ll;0;L;0061 0308;;;;N;LATIN SMALL LETTER A DIAERESIS;;00C4;;00C4
00E5;LATIN SMALL LETTER A WITH RING ABOVE;Ll;0;L;0061 030A;;;;N;LATIN SMALL LETTER A RING;;00C5;;00C5
00E6;LATIN SMALL LETTER AE;Ll;0;L;;;;;N;LATIN SMALL LETTER A E;;00C6;;00C6
00E7;LATIN SMALL LETTER C WITH CEDILLA;Ll;0;L;0063 0327;;;;N;LATIN SMALL LETTER C CEDILLA;;00C7;;00C7
00E8;LATIN SMALL LETTER E WITH GRAVE;Ll;0;L;0065 0300;;;;N;LATIN SMALL LETTER E GRAVE;;00C8;;00C8
00E9;LATIN SMALL LETTER E WITH ACUTE;Ll;0;L;0065 0301;;;;N;LATIN SMALL LETTER E ACUTE;;00C9;;00C9
00EA;LATIN SMALL LETTER E WITH CIRCUMFLEX;Ll;0;L;0065 0302;;;;N;LATIN SMALL LETTER E CIRCUMFLEX;;00CA;;00CA
00EB;LATIN SMALL LETTER E WITH DIAERESIS;Ll;0;L;0065 0308;;;;N;LATIN SMALL LETTER E DIAERESIS;;00CB;;00CB
00EC;LATIN SMALL LETTER I WITH GRAVE;Ll;0;L;0069 0300;;;;N;LATIN SMALL LETTER I GRAVE;;00CC;;00CC
00ED;LATIN SMALL LETTER I WITH ACUTE;Ll;0;L;0069 0301;;;;N;LATIN SMALL LETTER I ACUTE;;00CD;;00CD
00EE;LATIN SMALL LETTER I WITH CIRCUMFLEX;Ll;0;L;0069 0302;;;;N;LATIN SMALL LETTER I CIRCUMFLEX;;00CE;;00CE
00EF;LATIN SMALL LETTER I WITH DIAERESIS;Ll;0;L;0069 0308;;;;N;LATIN SMALL LETTER I DIAERESIS;;00CF;;00CF
00F0;LATIN SMALL LETTER ETH;Ll;0;L;;;;;N;;;00D0;;00D0
00F1;LATIN SMALL LETTER N WITH TILDE;Ll;0;L;006E 0303;;;;N;LATIN SMALL LETTER N TILDE;;00D1;;00D1
00F2;LATIN SMALL LETTER O WITH GRAVE;Ll;0;L;006F 0300;;;;N;LATIN SMALL LETTER O GRAVE;;00D2;;00D2
00F3;LATIN SMALL LETTER O WITH ACUTE;Ll;0;L;006F 0301;;;;N;LATIN SMALL LETTER O ACUTE;;00D3;;00D3
00F4;LATIN SMALL LETTER O WITH CIRCUMFLEX;Ll;0;L;006F 0302;;;;N;LATIN SMALL LETTER O CIRCUMFLEX;;00D4;;00D4
00F5;LATIN SMALL LETTER O WITH TILDE;Ll;0;L;006F 0303;;;;N;LATIN SMALL LETTER O TILDE;;00D5;;00D5
00F6;LATIN SMALL LETTER O WITH DIAERESIS;Ll;0;L;006F 0308;;;;N;LATIN SMALL LETTER O DIAERESIS;;00D6;;00D6
00F7;DIVISION SIGN;Sm;0;ON;;;;;N;;;;;
00F8;LATIN SMALL LETTER O WITH STROKE;Ll;0;L;;;;;N;LATIN SMALL LETTER O SLASH;;00D8;;00D8
00F9;LATIN SMALL LETTER U WITH GRAVE;Ll;0;L;0075 0300;;;;N;LATIN SMALL LETTER U GRAVE;;00D9;;00D9
00FA;LATIN SMALL LETTER U WITH ACUTE;Ll;0;L;0075 0301;;;;N;LATIN SMALL LETTER U ACUTE;;00DA;;00DA
00FB;LATIN SMALL LETTER U WITH CIRCUMFLEX;Ll;0;L;0075 0302;;;;N;LATIN SMALL LETTER U CIRCUMFLEX;;00DB;;00DB
00FC;LATIN SMALL LETTER U WITH DIAERESIS;Ll;0;L;0075 0308;;;;N;LATIN SMALL LETTER U DIAERESIS;;00DC;;00DC
00FD;LATIN SMALL LETTER Y WITH ACUTE;Ll;0;L;0079 0301;;;;N;LATIN SMALL LETTER Y ACUTE;;00DD;;00DD
00FE;LATIN SMALL LETTER THORN;Ll;0;L;;;;;N;;;00DE;;00DE
00FF;LATIN SMALL LETTER Y WITH DIAERESIS;Ll;0;L;0079 0308;;;;N;LATIN SMALL LETTER Y DIAERESIS;;0178;;0178

src/duk_bi_date.c (2)

@@ -61,7 +61,7 @@ DUK_LOCAL_DECL duk_ret_t duk__set_this_timeval_from_dparts(duk_context *ctx, duk
 #define DUK__YEAR(x) ((duk_uint8_t) ((x) - 1970))
 DUK_LOCAL duk_uint8_t duk__date_equivyear[14] = {
 #if 1
-	/* This is based on V8 EquivalentYear() algorithm (see src/genequivyear.py):
+	/* This is based on V8 EquivalentYear() algorithm (see util/genequivyear.py):
 	 * http://code.google.com/p/v8/source/browse/trunk/src/date.h#146
 	 */

src/duk_heap_hashstring.c (2)

@@ -13,7 +13,7 @@
  * with real world inputs). Unless the hash is cryptographic, it's always
  * possible to craft inputs with maximal hash collisions.
  *
- * NOTE: The hash algorithms must match src/dukutil.py:duk_heap_hashstring()
+ * NOTE: The hash algorithms must match tools/dukutil.py:duk_heap_hashstring()
  * for ROM string support!
  */

src/duk_numconv.c (2)

@@ -18,7 +18,7 @@
 #define DUK__DIGITCHAR(x) duk_lc_digits[(x)]
 /*
- * Tables generated with src/gennumdigits.py.
+ * Tables generated with util/gennumdigits.py.
  *
  * duk__str2num_digits_for_radix indicates, for each radix, how many input
  * digits should be considered significant for string-to-number conversion.

src/duk_unicode_support.c (14)

@@ -285,7 +285,7 @@ DUK_INTERNAL duk_ucodepoint_t duk_unicode_decode_xutf8_checked(duk_hthread *thr,
  * chosen from several variants, based on x64 gcc -O2 testing. See:
  * https://github.com/svaarala/duktape/pull/422
  *
- * NOTE: must match src/dukutil.py:duk_unicode_unvalidated_utf8_length().
+ * NOTE: must match tools/dukutil.py:duk_unicode_unvalidated_utf8_length().
  */
 #if defined(DUK_USE_PREFER_SIZE)
@@ -396,7 +396,7 @@ DUK_INTERNAL duk_size_t duk_unicode_unvalidated_utf8_length(const duk_uint8_t *d
  * Used for slow path Unicode matching.
  */
-/* Must match src/extract_chars.py, generate_match_table3(). */
+/* Must match tools/extract_chars.py, generate_match_table3(). */
 DUK_LOCAL duk_uint32_t duk__uni_decode_value(duk_bitdecoder_ctx *bd_ctx) {
 	duk_uint32_t t;
@@ -467,7 +467,7 @@ DUK_INTERNAL duk_small_int_t duk_unicode_is_whitespace(duk_codepoint_t cp) {
  * FEFF;ZERO WIDTH NO-BREAK SPACE;Cf;0;BN;;;;;N;BYTE ORDER MARK;;;;
  *
  * It also specifies any Unicode category 'Zs' characters as white
- * space. These can be extracted with the "src/extract_chars.py" script.
+ * space. These can be extracted with the "tools/extract_chars.py" script.
  * Current result:
  *
  * RAW OUTPUT:
@@ -574,7 +574,7 @@ DUK_INTERNAL duk_small_int_t duk_unicode_is_identifier_start(duk_codepoint_t cp)
  *
  * The "UnicodeLetter" alternative of the production allows letters
  * from various Unicode categories. These can be extracted with the
- * "src/extract_chars.py" script.
+ * "tools/extract_chars.py" script.
  *
  * Because the result has hundreds of Unicode codepoint ranges, matching
  * for any values >= 0x80 are done using a very slow range-by-range scan
@@ -671,7 +671,7 @@ DUK_INTERNAL duk_small_int_t duk_unicode_is_identifier_part(duk_codepoint_t cp)
  * The matching code reuses the "identifier start" tables, and then
 * consults a separate range set for characters in "identifier part"
 * but not in "identifier start". These can be extracted with the
- * "src/extract_chars.py" script.
+ * "tools/extract_chars.py" script.
  *
  * UnicodeCombiningMark -> categories Mn, Mc
  * UnicodeDigit -> categories Nd
@@ -786,14 +786,14 @@ DUK_INTERNAL duk_small_int_t duk_unicode_is_letter(duk_codepoint_t cp) {
 /*
  * Complex case conversion helper which decodes a bit-packed conversion
- * control stream generated by unicode/extract_caseconv.py. The conversion
+ * control stream generated by tools/extract_caseconv.py. The conversion
  * is very slow because it runs through the conversion data in a linear
  * fashion to save space (which is why ASCII characters have a special
  * fast path before arriving here).
  *
  * The particular bit counts etc have been determined experimentally to
  * be small but still sufficient, and must match the Python script
- * (src/extract_caseconv.py).
+ * (tools/extract_caseconv.py).
  *
  * The return value is the case converted codepoint or -1 if the conversion
  * results in multiple characters (this is useful for regexp Canonicalization

src/duk_unicode_tables.c (4)

@@ -13,7 +13,7 @@
  * compactness is most important.
  *
  * The tables are matched using uni_range_match() and the format
- * is described in src/extract_chars.py.
+ * is described in tools/extract_chars.py.
  */
 #ifdef DUK_USE_SOURCE_NONBMP
@@ -47,7 +47,7 @@
 #endif
 /*
- * Case conversion tables generated using src/extract_caseconv.py.
+ * Case conversion tables generated using tools/extract_caseconv.py.
  */
 /* duk_unicode_caseconv_uc[] */

testrunner/client-simple-node/run_commit_test.py (45)

@@ -309,18 +309,23 @@ def context_helper_minsize_fltoetc(archopt, strip):
     cwd = os.getcwd()
     def comp():
         execute([ 'make', 'dist' ])
+        execute([ 'rm', '-rf', os.path.join(cwd, 'dist', 'src') ])
+        execute([ 'rm', '-rf', os.path.join(cwd, 'dist', 'src-noline') ])
+        execute([ 'rm', '-rf', os.path.join(cwd, 'dist', 'src-separate') ])
         cmd = [
-            'python2', os.path.join(cwd, 'tools', 'genconfig.py'),
-            '--metadata', os.path.join(cwd, 'config'),
-            '--output', os.path.join(cwd, 'dist', 'src', 'duk_config.h'),
+            'python2', os.path.join(cwd, 'dist', 'tools', 'prepare_sources.py'),
+            '--source-directory', os.path.join(cwd, 'dist', 'src-input'),
+            '--output-directory', os.path.join(cwd, 'dist'),
+            '--config-metadata', os.path.join(cwd, 'dist', 'config', 'genconfig_metadata.tar.gz'),
            '--option-file', os.path.join(cwd, 'config', 'examples', 'low_memory.yaml')
         ]
         if strip:
             cmd += [
-                '--option-file', os.path.join(cwd, 'config', 'examples', 'low_memory_strip.yaml')
-            ]
-        cmd += [
-            'duk-config-header'
+                '--option-file', os.path.join(cwd, 'config', 'examples', 'low_memory_strip.yaml'),
+                '--unicode-data', os.path.join(cwd, 'dist', 'src-input', 'UnicodeData-8bit.txt'),
+                '--special-casing', os.path.join(cwd, 'dist', 'src-input', 'SpecialCasing-8bit.txt')
             ]
         execute(cmd)
         execute([
@@ -645,12 +650,21 @@ def context_helper_hello_ram(archopt):
     def test(genconfig_opts):
         os.chdir(cwd)
-        execute([ 'make', 'clean' ])
-        execute([
-            'python2', os.path.join(cwd, 'util', 'make_dist.py'),
-            '--rom-support'
-        ])
-        genconfig_dist_src(genconfig_opts)
+        execute([ 'make', 'clean', 'dist' ])
+        execute([ 'rm', '-rf', os.path.join(cwd, 'dist', 'src') ])
+        execute([ 'rm', '-rf', os.path.join(cwd, 'dist', 'src-noline') ])
+        execute([ 'rm', '-rf', os.path.join(cwd, 'dist', 'src-separate') ])
+        cmd = [
+            'python2', os.path.join(cwd, 'dist', 'tools', 'prepare_sources.py'),
+            '--source-directory', os.path.join(cwd, 'dist', 'src-input'),
+            '--output-directory', os.path.join(cwd, 'dist'),
+            '--config-metadata', os.path.join(cwd, 'dist', 'config', 'genconfig_metadata.tar.gz'),
+            '--rom-support'
+        ] + genconfig_opts
+        print(repr(cmd))
+        execute(cmd)
         execute([
             'gcc', '-ohello', archopt,
             '-Os', '-fomit-frame-pointer',
@@ -943,6 +957,13 @@ def main():
     os.chdir(temp_dir)
     os.mkdir(repo_dir)
+    print('')
+    print('*** GCC and Clang versions')
+    print('')
+    execute([ 'gcc', '-v' ], catch=True)
+    execute([ 'clang', '-v' ], catch=True)
     print('')
     print('*** Unpack repos and helpers')
     print('')

tools/combine_src.py (11)

@@ -5,7 +5,7 @@
 # Overview of the process:
 #
 # * Parse user supplied C files. Add automatic #undefs at the end
-#   of each C file to avoid defined bleeding from one file to another.
+#   of each C file to avoid defines bleeding from one file to another.
 #
 # * Combine the C files in specified order. If sources have ordering
 #   dependencies (depends on application), order may matter.
@@ -14,11 +14,11 @@
 #   them either as "internal" (found in specified include path) or
 #   "external". Internal includes, unless explicitly excluded, are
 #   inlined into the result while extenal includes are left as is.
-#   Duplicate #include statements are replaced with a comment.
+#   Duplicate internal #include statements are replaced with a comment.
 #
 # At every step, source and header lines are represented with explicit
 # line objects which keep track of original filename and line. The
-# output contains #line directives, if necessary, to ensure error
+# output contains #line directives, if requested, to ensure error
 # throwing and other diagnostic info will work in a useful manner when
 # deployed. It's also possible to generate a combined source with no
 # #line directives.
@@ -234,7 +234,7 @@ def main():
     assert(opts.output_source)
     assert(opts.output_metadata)
-    print('Read input files, add automatic #undefs')
+    # Read input files, add automatic #undefs
     sources = args
     files = []
     for fn in sources:
@@ -243,7 +243,6 @@ def main():
         addAutomaticUndefs(res)
         files.append(res)
-    print('Create combined source file from %d source files' % len(files))
     combined_source, metadata = \
         createCombined(files, opts.prologue, opts.line_directives)
     with open(opts.output_source, 'wb') as f:
@@ -251,7 +250,7 @@ def main():
     with open(opts.output_metadata, 'wb') as f:
         f.write(json.dumps(metadata, indent=4))
-    print('Wrote %d bytes to %s' % (len(combined_source), opts.output_source))
+    print('Combined %d source files, %d bytes written to %s' % (len(files), len(combined_source), opts.output_source))
 if __name__ == '__main__':
     main()
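
The header comment above describes adding automatic #undefs so that file-local defines don't bleed between concatenated C files. A minimal sketch of that idea (not combine_src.py's actual implementation) could look like this:

    # Illustrative sketch only: append an #undef for every object-like #define
    # found in a C file, so file-local macros do not leak into the next file
    # when sources are concatenated.
    import re

    def add_automatic_undefs(lines):
        defined = []
        for line in lines:
            m = re.match(r'^#define\s+(\w+)', line)
            if m and m.group(1) not in defined:
                defined.append(m.group(1))
        return lines + ['#undef %s' % name for name in defined]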

tools/extract_caseconv.py (26)

@@ -12,14 +12,16 @@
 # and String.prototype.toLocaleLowerCase()), so they are best handled
 # in C anyway.
 #
-# Case conversion rules for ASCII are also excluded as they are
-# handled by C fast path. Rules for non-BMP characters (codepoints
-# above U+FFFF) are omitted as they're not required for standard
-# Ecmascript.
+# Case conversion rules for ASCII are also excluded as they are handled
+# by the C fast path. Rules for non-BMP characters (codepoints above
+# U+FFFF) are omitted as they're not required for standard Ecmascript.
 #
-import os, sys, math
+import os
+import sys
+import math
 import optparse
 import dukutil
 class UnicodeData:
@@ -215,7 +217,7 @@ def generate_tables(convmap):
     ranges = []   # range mappings (2 or more consecutive mappings with a certain skip)
     singles = []  # 1:1 character mappings
-    complex = []  # 1:n character mappings
+    multis = []   # 1:n character mappings
     # Ranges with skips
@@ -267,25 +269,25 @@ def generate_tables(convmap):
     # print 'special399, skip %d: %d %d %d' % (skip, start_i, start_o, count)
     # print len(tmp.keys())
     # print repr(tmp)
-    # XXX: need to put 12 remaining mappings back to convmap...
+    # XXX: need to put 12 remaining mappings back to convmap
     # 1:n conversions
     k = convmap.keys()
     k.sort()
     for i in k:
-        complex.append([i, convmap[i]])  # codepoint, string
+        multis.append([i, convmap[i]])   # codepoint, string
         del convmap[i]
     for t in singles:
         print repr(t)
-    for t in complex:
+    for t in multis:
         print repr(t)
     print 'range mappings: %d' % len(ranges)
     print 'single character mappings: %d' % len(singles)
-    print 'complex mappings (1:n): %d' % len(complex)
+    print 'complex mappings (1:n): %d' % len(multis)
     print 'remaining (should be zero): %d' % len(convmap.keys())
     # XXX: opportunities for diff encoding skip=3 ranges?
@@ -330,9 +332,9 @@ def generate_tables(convmap):
         be.bits(cp_i, 16)
         be.bits(cp_o, 16)
-    count = len(complex)
+    count = len(multis)
     be.bits(count, 7)
-    for t in complex:
+    for t in multis:
         cp_i, str_o = t[0], t[1]
         be.bits(cp_i, 16)
         be.bits(len(str_o), 2)

tools/extract_chars.py (13)

@@ -15,8 +15,11 @@
 # supported in standard Ecmascript.
 #
-import os, sys, math
+import os
+import sys
+import math
 import optparse
 import dukutil
 def read_unicode_data(unidata, catsinc, catsexc, filterfunc):
@@ -318,19 +321,19 @@ def main():
         return True
     print('read unicode data')
-    res = read_unicode_data(unidata, catsinc, catsexc, filter1)
+    uni_filtered = read_unicode_data(unidata, catsinc, catsexc, filter1)
     print('done reading unicode data')
     # Raw output
     #print('RAW OUTPUT:')
     #print('===========')
-    #print('\n'.join(res))
+    #print('\n'.join(uni_filtered))
     # Scan ranges
     #print('')
     #print('RANGES:')
     #print('=======')
-    ranges = scan_ranges(res)
+    ranges = scan_ranges(uni_filtered)
     #for i in ranges:
     #    if i[0] == i[1]:
     #        print('0x%04x' % i[0])
@@ -376,7 +379,7 @@ def main():
     # Image (for illustrative purposes only)
     if opts.out_png is not None:
-        generate_png(res, opts.out_png)
+        generate_png(uni_filtered, opts.out_png)
 if __name__ == '__main__':
     main()

tools/genbuildparams.py (removed; the entire 44-line file below is deleted)

@@ -1,44 +0,0 @@
#!/usr/bin/env python2
#
# Generate build parameter files based on build information.
# A C header is generated for C code, and a JSON file for
# build scripts etc which need to know the build config.
#
import os
import sys
import json
import optparse
import dukutil
if __name__ == '__main__':
parser = optparse.OptionParser()
parser.add_option('--version', dest='version')
parser.add_option('--git-commit', dest='git_commit')
parser.add_option('--git-describe', dest='git_describe')
parser.add_option('--git-branch', dest='git_branch')
parser.add_option('--out-json', dest='out_json')
parser.add_option('--out-header', dest='out_header')
(opts, args) = parser.parse_args()
t = {
'version': opts.version,
'git_commit': opts.git_commit,
'git_describe': opts.git_describe,
'git_branch': opts.git_branch,
}
f = open(opts.out_json, 'wb')
f.write(dukutil.json_encode(t).encode('ascii'))
f.close()
f = open(opts.out_header, 'wb')
f.write('#ifndef DUK_BUILDPARAMS_H_INCLUDED\n')
f.write('#define DUK_BUILDPARAMS_H_INCLUDED\n')
f.write('/* automatically generated by genbuildparams.py, do not edit */\n')
f.write('\n')
f.write('/* DUK_VERSION is defined in duktape.h */')
f.write('\n')
f.write('#endif /* DUK_BUILDPARAMS_H_INCLUDED */\n')
f.close()

tools/genbuiltins.py (32)

@@ -559,6 +559,7 @@ def metadata_normalize_rom_property_attributes(meta):
 # Add a 'name' property for all top level functions; expected by RAM
 # initialization code.
 def metadata_normalize_ram_function_names(meta):
+    num_added = 0
     for o in meta['objects']:
         if not o.get('callable', False):
             continue
@@ -568,9 +569,13 @@ def metadata_normalize_ram_function_names(meta):
                 name_prop = p
                 break
         if name_prop is None:
-            print('Adding missing "name" property for top level function %s' % o['id'])
+            num_added += 1
+            #print('Adding missing "name" property for function %s' % o['id'])
             o['properties'].append({ 'key': 'name', 'value': '', 'attributes': '' })
+    if num_added > 0:
+        print('Added missing "name" property for %d functions' % num_added)
 # Add a built-in objects list for RAM initialization.
@@ -752,8 +757,8 @@ def metadata_remove_orphan_objects(meta):
                 _markId(v.get('getter_id'))
                 _markId(v.get('setter_id'))
-        print('Mark reachable: reachable count initially %d, now %d' % \
-              (reachable_count, len(reachable.keys())))
+        #print('Mark reachable: reachable count initially %d, now %d' % \
+        #      (reachable_count, len(reachable.keys())))
         if reachable_count == len(reachable.keys()):
             break
@@ -769,6 +774,7 @@ def metadata_remove_orphan_objects(meta):
                 num_deleted += 1
                 break
+    if num_deleted > 0:
         print('Deleted %d unreachable objects' % num_deleted)
@@ -969,7 +975,7 @@ def load_metadata(opts, rom=False, build_info=None):
     # Add Duktape.version and (Duktape.env for ROM case).
     for o in meta['objects']:
         if o['id'] == 'bi_duktape':
-            o['properties'].insert(0, { 'key': 'version', 'value': int(build_info['version']), 'attributes': '' })
+            o['properties'].insert(0, { 'key': 'version', 'value': int(build_info['duk_version']), 'attributes': '' })
             if rom:
                 # Use a fixed (quite dummy for now) Duktape.env
                 # when ROM builtins are in use. In the RAM case
@@ -2811,7 +2817,10 @@ def emit_header_native_function_declarations(genc, meta):
 def main():
     parser = optparse.OptionParser()
-    parser.add_option('--buildinfo', dest='buildinfo', help='Build info, JSON format')
+    parser.add_option('--git-commit', dest='git_commit', default=None, help='Git commit hash')
+    parser.add_option('--git-describe', dest='git_describe', default=None, help='Git describe')
+    parser.add_option('--git-branch', dest='git_branch', default=None, help='Git branch name')
+    parser.add_option('--duk-version', dest='duk_version', default=None, help='Duktape version (e.g. 10203)')
     parser.add_option('--used-stridx-metadata', dest='used_stridx_metadata', help='DUK_STRIDX_xxx used by source/headers, JSON format')
     parser.add_option('--strings-metadata', dest='strings_metadata', help='Built-in strings metadata file, YAML format')
     parser.add_option('--objects-metadata', dest='objects_metadata', help='Built-in objects metadata file, YAML format')
@@ -2828,11 +2837,12 @@ def main():
     # Options processing.
-    if opts.buildinfo is None:
-        raise Exception('missing buildinfo')
-
-    with open(opts.buildinfo, 'rb') as f:
-        build_info = dukutil.json_decode(f.read().strip())
+    build_info = {
+        'git_commit': opts.git_commit,
+        'git_branch': opts.git_branch,
+        'git_describe': opts.git_describe,
+        'duk_version': int(opts.duk_version),
+    }
     # Read in metadata files, normalizing and merging as necessary.
@@ -2954,7 +2964,7 @@ def main():
     # Write a JSON file with build metadata, e.g. built-in strings.
-    ver = long(build_info['version'])
+    ver = long(build_info['duk_version'])
     plain_strs = []
     base64_strs = []
     str_objs = []
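
For illustration (not part of this PR's diff), the build information previously passed to genbuiltins.py as a single --buildinfo JSON file is now supplied as individual options; the values below are placeholders and the other required metadata/output options are omitted:

    # Illustrative sketch: old vs. new build info arguments for genbuiltins.py.
    old_args = ['--buildinfo', 'duk_buildinfo.json']   # hypothetical JSON file name
    new_args = [
        '--git-commit', 'abcdef0',
        '--git-describe', 'v2.0.0-devel',
        '--git-branch', 'master',
        '--duk-version', '20000',   # e.g. 2.0.0 encoded as 20000
    ]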

tools/genconfig.py (63)

@@ -1398,7 +1398,7 @@ def generate_duk_config_header(opts, meta_dir):
 #  Main
 #
-def main():
+def add_genconfig_optparse_options(parser, direct=False):
     # Forced options from multiple sources are gathered into a shared list
     # so that the override order remains the same as on the command line.
     force_options_yaml = []
@@ -1436,19 +1436,14 @@
             line = line[:-1]
         fixup_header_lines.append(line)
-    commands = [
-        'duk-config-header',
-        'feature-documentation',
-        'config-documentation'
-    ]
-    parser = optparse.OptionParser(
-        usage='Usage: %prog [options] COMMAND',
-        description='Generate a duk_config.h or config option documentation based on config metadata.',
-        epilog='COMMAND can be one of: ' + ', '.join(commands) + '.'
-    )
-    parser.add_option('--metadata', dest='metadata', default=None, help='metadata directory or metadata tar.gz file')
-    parser.add_option('--output', dest='output', default=None, help='output filename for C header or RST documentation file')
+    if direct:
+        parser.add_option('--metadata', dest='config_metadata', default=None, help='metadata directory or metadata tar.gz file')
+        parser.add_option('--output', dest='output', default=None, help='output filename for C header or RST documentation file')
+    else:
+        # Different option name when called through prepare_sources.py,
+        # also no --output option.
+        parser.add_option('--config-metadata', dest='config_metadata', default=None, help='metadata directory or metadata tar.gz file')
     parser.add_option('--platform', dest='platform', default=None, help='platform (default is autodetect)')
     parser.add_option('--compiler', dest='compiler', default=None, help='compiler (default is autodetect)')
     parser.add_option('--architecture', dest='architecture', default=None, help='architecture (default is autodetec)')
@@ -1471,26 +1466,46 @@
     parser.add_option('--fixup-line', type='string', dest='fixup_header_lines', action='callback', callback=add_fixup_header_line, default=fixup_header_lines, help='C header fixup line to be appended to generated header (e.g. --fixup-line "#define DUK_USE_FASTINT")')
     parser.add_option('--sanity-warning', dest='sanity_strict', action='store_false', default=True, help='emit a warning instead of #error for option sanity check issues')
     parser.add_option('--use-cpp-warning', dest='use_cpp_warning', action='store_true', default=False, help='emit a (non-portable) #warning when appropriate')
+    if direct:
         parser.add_option('--git-commit', dest='git_commit', default=None, help='git commit hash to be included in header comments')
         parser.add_option('--git-describe', dest='git_describe', default=None, help='git describe string to be included in header comments')
         parser.add_option('--git-branch', dest='git_branch', default=None, help='git branch string to be included in header comments')
+
+def parse_options():
+    commands = [
+        'duk-config-header',
+        'feature-documentation',
+        'config-documentation'
+    ]
+    parser = optparse.OptionParser(
+        usage='Usage: %prog [options] COMMAND',
+        description='Generate a duk_config.h or config option documentation based on config metadata.',
+        epilog='COMMAND can be one of: ' + ', '.join(commands) + '.'
+    )
+    add_genconfig_optparse_options(parser, direct=True)
     (opts, args) = parser.parse_args()
-    meta_dir = opts.metadata
-    if opts.metadata is None:
+    return opts, args
+
+def genconfig(opts, args):
+    meta_dir = opts.config_metadata
+    if opts.config_metadata is None:
         if os.path.isfile(os.path.join('.', 'genconfig_metadata.tar.gz')):
-            opts.metadata = 'genconfig_metadata.tar.gz'
+            opts.config_metadata = 'genconfig_metadata.tar.gz'
         elif os.path.isdir(os.path.join('.', 'config-options')):
-            opts.metadata = '.'
+            opts.config_metadata = '.'
-    if opts.metadata is not None and os.path.isdir(opts.metadata):
-        meta_dir = opts.metadata
+    if opts.config_metadata is not None and os.path.isdir(opts.config_metadata):
+        meta_dir = opts.config_metadata
         metadata_src_text = 'Using metadata directory: %r' % meta_dir
-    elif opts.metadata is not None and os.path.isfile(opts.metadata) and tarfile.is_tarfile(opts.metadata):
+    elif opts.config_metadata is not None and os.path.isfile(opts.config_metadata) and tarfile.is_tarfile(opts.config_metadata):
         meta_dir = get_auto_delete_tempdir()
-        tar = tarfile.open(name=opts.metadata, mode='r:*')
+        tar = tarfile.open(name=opts.config_metadata, mode='r:*')
         tar.extractall(path=meta_dir)
-        metadata_src_text = 'Using metadata tar file %r, unpacked to directory: %r' % (opts.metadata, meta_dir)
+        metadata_src_text = 'Using metadata tar file %r, unpacked to directory: %r' % (opts.config_metadata, meta_dir)
     else:
         raise Exception('metadata source must be a directory or a tar.gz file')
@@ -1526,5 +1541,9 @@
     else:
         raise Exception('invalid command: %r' % cmd)
+def main():
+    opts, args = parse_options()
+    genconfig(opts, args)
 if __name__ == '__main__':
     main()
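
For illustration (not part of this PR's diff), the refactor above lets another tool reuse genconfig's option definitions instead of duplicating them, which is what the prepare_sources.py hunk below does; the sketch assumes it runs in the same directory as tools/genconfig.py:

    # Illustrative sketch: reusing the refactored option handling from another
    # script, mirroring prepare_sources.py in this PR. Assumes genconfig.py is
    # importable (same directory).
    import optparse
    import genconfig

    parser = optparse.OptionParser()
    genconfig.add_genconfig_optparse_options(parser)   # direct=False: --config-metadata, no --output
    (opts, args) = parser.parse_args()
    # The caller can then drive config generation itself, e.g. via
    # genconfig.genconfig(opts, args), which the refactor exposes.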

tools/prepare_sources.py (112)

@@ -15,6 +15,8 @@ import json
 import yaml
 import subprocess
+import genconfig
 # Helpers
 def exec_get_stdout(cmd, input=None, default=None, print_stdout=False):
@@ -214,29 +216,11 @@ def main():
     parser.add_option('--user-builtin-metadata', dest='user_builtin_metadata', action='append', default=[], help='User strings and objects to add, YAML format (can be repeated for multiple overrides)')
     # Options forwarded to genconfig.py.
-    parser.add_option('--config-metadata', dest='config_metadata', default=None, help='metadata directory or metadata tar.gz file')
-    parser.add_option('--platform', dest='platform', default=None, help='platform (default is autodetect)')
-    parser.add_option('--compiler', dest='compiler', default=None, help='compiler (default is autodetect)')
-    parser.add_option('--architecture', dest='architecture', default=None, help='architecture (default is autodetec)')
-    parser.add_option('--c99-types-only', dest='c99_types_only', action='store_true', default=False, help='assume C99 types, no legacy type detection')
-    parser.add_option('--dll', dest='dll', action='store_true', default=False, help='dll build of Duktape, affects symbol visibility macros especially on Windows')
-    parser.add_option('--support-feature-options', dest='support_feature_options', action='store_true', default=False, help='support DUK_OPT_xxx feature options in duk_config.h')
-    parser.add_option('--emit-legacy-feature-check', dest='emit_legacy_feature_check', action='store_true', default=False, help='emit preprocessor checks to reject legacy feature options (DUK_OPT_xxx)')
-    parser.add_option('--emit-config-sanity-check', dest='emit_config_sanity_check', action='store_true', default=False, help='emit preprocessor checks for config option consistency (DUK_OPT_xxx)')
-    parser.add_option('--omit-removed-config-options', dest='omit_removed_config_options', action='store_true', default=False, help='omit removed config options from generated headers')
-    parser.add_option('--omit-deprecated-config-options', dest='omit_deprecated_config_options', action='store_true', default=False, help='omit deprecated config options from generated headers')
-    parser.add_option('--omit-unused-config-options', dest='omit_unused_config_options', action='store_true', default=False, help='omit unused config options from generated headers')
-    parser.add_option('--add-active-defines-macro', dest='add_active_defines_macro', action='store_true', default=False, help='add DUK_ACTIVE_DEFINES macro, for development only')
-    parser.add_option('--define', type='string', dest='force_options_yaml', action='callback', callback=add_force_option_define, default=force_options_yaml, help='force #define option using a C compiler like syntax, e.g. "--define DUK_USE_DEEP_C_STACK" or "--define DUK_USE_TRACEBACK_DEPTH=10"')
-    parser.add_option('-D', type='string', dest='force_options_yaml', action='callback', callback=add_force_option_define, default=force_options_yaml, help='synonym for --define, e.g. "-DDUK_USE_DEEP_C_STACK" or "-DDUK_USE_TRACEBACK_DEPTH=10"')
-    parser.add_option('--undefine', type='string', dest='force_options_yaml', action='callback', callback=add_force_option_undefine, default=force_options_yaml, help='force #undef option using a C compiler like syntax, e.g. "--undefine DUK_USE_DEEP_C_STACK"')
-    parser.add_option('-U', type='string', dest='force_options_yaml', action='callback', callback=add_force_option_undefine, default=force_options_yaml, help='synonym for --undefine, e.g. "-UDUK_USE_DEEP_C_STACK"')
-    parser.add_option('--option-yaml', type='string', dest='force_options_yaml', action='callback', callback=add_force_option_yaml, default=force_options_yaml, help='force option(s) using inline YAML (e.g. --option-yaml "DUK_USE_DEEP_C_STACK: true")')
-    parser.add_option('--option-file', type='string', dest='force_options_yaml', action='callback', callback=add_force_option_file, default=force_options_yaml, help='YAML file(s) providing config option overrides')
-    parser.add_option('--fixup-file', type='string', dest='fixup_header_lines', action='callback', callback=add_fixup_header_file, default=fixup_header_lines, help='C header snippet file(s) to be appended to generated header, useful for manual option fixups')
-    parser.add_option('--fixup-line', type='string', dest='fixup_header_lines', action='callback', callback=add_fixup_header_line, default=fixup_header_lines, help='C header fixup line to be appended to generated header (e.g. --fixup-line "#define DUK_USE_FASTINT")')
+    genconfig.add_genconfig_optparse_options(parser)
+
+    # Options for Unicode.
+    parser.add_option('--unicode-data', dest='unicode_data', default=None, help='Provide custom UnicodeData.txt')
+    parser.add_option('--special-casing', dest='special_casing', default=None, help='Provide custom SpecialCasing.txt')
parser.add_option('--sanity-warning', dest='sanity_strict', action='store_false', default=True, help='emit a warning instead of #error for option sanity check issues')
parser.add_option('--use-cpp-warning', dest='use_cpp_warning', action='store_true', default=False, help='emit a (non-portable) #warning when appropriate')
(opts, args) = parser.parse_args() (opts, args) = parser.parse_args()
@ -290,6 +274,15 @@ def main():
git_describe_cstring = cstring(git_describe) git_describe_cstring = cstring(git_describe)
git_branch_cstring = cstring(git_branch) git_branch_cstring = cstring(git_branch)
if opts.unicode_data is None:
unicode_data = os.path.join(srcdir, 'UnicodeData.txt')
else:
unicode_data = opts.unicode_data
if opts.special_casing is None:
special_casing = os.path.join(srcdir, 'SpecialCasing.txt')
else:
special_casing = opts.special_casing
print('Config-and-prepare for Duktape version %s, commit %s, describe %s, branch %s' % \ print('Config-and-prepare for Duktape version %s, commit %s, describe %s, branch %s' % \
(duk_version_formatted, git_commit, git_describe, git_branch)) (duk_version_formatted, git_commit, git_describe, git_branch))
@ -431,7 +424,8 @@ def main():
copy_and_cquote('AUTHORS.rst', os.path.join(outdir, 'AUTHORS.rst.tmp')) copy_and_cquote('AUTHORS.rst', os.path.join(outdir, 'AUTHORS.rst.tmp'))
# Create a duk_config.h. # Create a duk_config.h.
# XXX: might be easier to invoke genconfig directly # XXX: might be easier to invoke genconfig directly, but there are a few
# options which currently conflict (output file, git commit info, etc).
def forward_genconfig_options(): def forward_genconfig_options():
res = [] res = []
res += [ '--metadata', os.path.abspath(opts.config_metadata) ] # rename option, --config-metadata => --metadata res += [ '--metadata', os.path.abspath(opts.config_metadata) ] # rename option, --config-metadata => --metadata
@ -459,10 +453,10 @@ def main():
res += [ '--omit-unused-config-options' ] res += [ '--omit-unused-config-options' ]
if opts.add_active_defines_macro: if opts.add_active_defines_macro:
res += [ '--add-active-defines-macro' ] res += [ '--add-active-defines-macro' ]
for i in force_options_yaml: for i in opts.force_options_yaml:
res += [ '--option-yaml', i ] res += [ '--option-yaml', i ]
for i in fixup_header_lines: for i in opts.fixup_header_lines:
res += [ '--fixup-linu', i ] res += [ '--fixup-line', i ]
if not opts.sanity_strict: if not opts.sanity_strict:
res += [ '--sanity-warning' ] res += [ '--sanity-warning' ]
if opts.use_cpp_warning: if opts.use_cpp_warning:
@ -478,7 +472,7 @@ def main():
cmd += [ cmd += [
'duk-config-header' 'duk-config-header'
] ]
print(repr(cmd)) #print(repr(cmd))
exec_print_stdout(cmd) exec_print_stdout(cmd)
copy_file(os.path.join(outdir, 'duk_config.h.tmp'), os.path.join(outdir, 'src', 'duk_config.h')) copy_file(os.path.join(outdir, 'duk_config.h.tmp'), os.path.join(outdir, 'src', 'duk_config.h'))
@ -516,18 +510,7 @@ def main():
# There are currently no profile specific variants of strings/builtins, but # There are currently no profile specific variants of strings/builtins, but
# this will probably change when functions are added/removed based on profile. # this will probably change when functions are added/removed based on profile.
# XXX: nuke this util, it's pointless # XXX: call as direct python
exec_print_stdout([
sys.executable,
os.path.join('tools', 'genbuildparams.py'),
'--version=' + str(duk_version),
'--git-commit=' + git_commit,
'--git-describe=' + git_describe,
'--git-branch=' + git_branch,
'--out-json=' + os.path.join(outdir, 'src-separate', 'buildparams.json.tmp'),
'--out-header=' + os.path.join(outdir, 'src-separate', 'duk_buildparams.h.tmp')
])
res = exec_get_stdout([ res = exec_get_stdout([
sys.executable, sys.executable,
os.path.join('tools', 'scan_used_stridx_bidx.py') os.path.join('tools', 'scan_used_stridx_bidx.py')
@ -538,31 +521,40 @@ def main():
with open(os.path.join(outdir, 'duk_used_stridx_bidx_defs.json.tmp'), 'wb') as f: with open(os.path.join(outdir, 'duk_used_stridx_bidx_defs.json.tmp'), 'wb') as f:
f.write(res) f.write(res)
gb_opts = [] # XXX: call as direct python? does this need to work outside of prepare_sources.py?
gb_opts.append('--ram-support') # enable by default cmd = [
if opts.rom_support:
# ROM string/object support is not enabled by default because
# it increases the generated duktape.c considerably.
print('Enabling --rom-support for genbuiltins.py')
gb_opts.append('--rom-support')
if opts.rom_auto_lightfunc:
print('Enabling --rom-auto-lightfunc for genbuiltins.py')
gb_opts.append('--rom-auto-lightfunc')
for fn in opts.user_builtin_metadata:
print('Forwarding --user-builtin-metadata %s' % fn)
gb_opts.append('--user-builtin-metadata')
gb_opts.append(fn)
exec_print_stdout([
sys.executable, sys.executable,
os.path.join('tools', 'genbuiltins.py'), os.path.join('tools', 'genbuiltins.py'),
'--buildinfo=' + os.path.join(outdir, 'src-separate', 'buildparams.json.tmp'), ]
cmd += [
'--git-commit', git_commit,
'--git-branch', git_branch,
'--git-describe', git_describe,
'--duk-version', str(duk_version)
]
cmd += [
'--used-stridx-metadata=' + os.path.join(outdir, 'duk_used_stridx_bidx_defs.json.tmp'), '--used-stridx-metadata=' + os.path.join(outdir, 'duk_used_stridx_bidx_defs.json.tmp'),
'--strings-metadata=' + os.path.join(srcdir, 'strings.yaml'), '--strings-metadata=' + os.path.join(srcdir, 'strings.yaml'),
'--objects-metadata=' + os.path.join(srcdir, 'builtins.yaml'), '--objects-metadata=' + os.path.join(srcdir, 'builtins.yaml'),
'--out-header=' + os.path.join(outdir, 'src-separate', 'duk_builtins.h'), '--out-header=' + os.path.join(outdir, 'src-separate', 'duk_builtins.h'),
'--out-source=' + os.path.join(outdir, 'src-separate', 'duk_builtins.c'), '--out-source=' + os.path.join(outdir, 'src-separate', 'duk_builtins.c'),
'--out-metadata-json=' + os.path.join(outdir, 'duk_build_meta.json') '--out-metadata-json=' + os.path.join(outdir, 'duk_build_meta.json')
] + gb_opts) ]
cmd.append('--ram-support') # enable by default
if opts.rom_support:
# ROM string/object support is not enabled by default because
# it increases the generated duktape.c considerably.
print('Enabling --rom-support for genbuiltins.py')
cmd.append('--rom-support')
if opts.rom_auto_lightfunc:
print('Enabling --rom-auto-lightfunc for genbuiltins.py')
cmd.append('--rom-auto-lightfunc')
for fn in opts.user_builtin_metadata:
print('Forwarding --user-builtin-metadata %s' % fn)
cmd.append('--user-builtin-metadata')
cmd.append(fn)
#print(repr(cmd))
exec_print_stdout(cmd)
# Autogenerated Unicode files # Autogenerated Unicode files
# #
@ -628,7 +620,7 @@ def main():
exec_print_stdout([ exec_print_stdout([
sys.executable, sys.executable,
os.path.join('tools', 'prepare_unicode_data.py'), os.path.join('tools', 'prepare_unicode_data.py'),
os.path.join(srcdir, 'UnicodeData.txt'), unicode_data,
os.path.join(outdir, 'src-separate', 'UnicodeData-expanded.tmp') os.path.join(outdir, 'src-separate', 'UnicodeData-expanded.tmp')
]) ])
@ -654,7 +646,7 @@ def main():
os.path.join('tools', 'extract_caseconv.py'), os.path.join('tools', 'extract_caseconv.py'),
'--command=caseconv_bitpacked', '--command=caseconv_bitpacked',
'--unicode-data=' + os.path.join(outdir, 'src-separate', 'UnicodeData-expanded.tmp'), '--unicode-data=' + os.path.join(outdir, 'src-separate', 'UnicodeData-expanded.tmp'),
'--special-casing=' + os.path.join(srcdir, 'SpecialCasing.txt'), '--special-casing=' + special_casing,
'--out-source=' + os.path.join(outdir, 'src-separate', 'duk_unicode_caseconv.c.tmp'), '--out-source=' + os.path.join(outdir, 'src-separate', 'duk_unicode_caseconv.c.tmp'),
'--out-header=' + os.path.join(outdir, 'src-separate', 'duk_unicode_caseconv.h.tmp'), '--out-header=' + os.path.join(outdir, 'src-separate', 'duk_unicode_caseconv.h.tmp'),
'--table-name-lc=duk_unicode_caseconv_lc', '--table-name-lc=duk_unicode_caseconv_lc',
@ -669,7 +661,7 @@ def main():
os.path.join('tools', 'extract_caseconv.py'), os.path.join('tools', 'extract_caseconv.py'),
'--command=re_canon_lookup', '--command=re_canon_lookup',
'--unicode-data=' + os.path.join(outdir, 'src-separate', 'UnicodeData-expanded.tmp'), '--unicode-data=' + os.path.join(outdir, 'src-separate', 'UnicodeData-expanded.tmp'),
'--special-casing=' + os.path.join(srcdir, 'SpecialCasing.txt'), '--special-casing=' + special_casing,
'--out-source=' + os.path.join(outdir, 'src-separate', 'duk_unicode_re_canon_lookup.c.tmp'), '--out-source=' + os.path.join(outdir, 'src-separate', 'duk_unicode_re_canon_lookup.c.tmp'),
'--out-header=' + os.path.join(outdir, 'src-separate', 'duk_unicode_re_canon_lookup.h.tmp'), '--out-header=' + os.path.join(outdir, 'src-separate', 'duk_unicode_re_canon_lookup.h.tmp'),
'--table-name-re-canon-lookup=duk_unicode_re_canon_lookup' '--table-name-re-canon-lookup=duk_unicode_re_canon_lookup'
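A minimal sketch (not part of the commit) of driving the new tools/prepare_sources.py with ROM builtins and the custom Unicode data options added above; all paths and the YAML file name are illustrative.

```python
# Hypothetical usage sketch; --unicode-data and --special-casing fall back to the
# src-input copies when omitted, matching the defaulting logic shown in the diff.
import subprocess
import sys

subprocess.check_call([
    sys.executable, 'tools/prepare_sources.py',
    '--source-directory', 'src-input',
    '--output-directory', 'prepared',
    '--config-metadata', 'config/genconfig_metadata.tar.gz',
    '--unicode-data', 'my/UnicodeData.txt',        # optional
    '--special-casing', 'my/SpecialCasing.txt',    # optional
    '--rom-support',
    '--rom-auto-lightfunc',
    '--user-builtin-metadata', 'my_builtins.yaml'
])
```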

2
util/bluebird-test-shim.js

@ -7,7 +7,7 @@ var window;
(function () { (function () {
var timers = []; var timers = [];
window = Function('return this')(); window = Function('return this')(); // window <- global object
window.setTimeout = function (fn, timeout) { window.setTimeout = function (fn, timeout) {
timers.push(fn); timers.push(fn);

2
util/check_align.c

@ -2,7 +2,7 @@
* Check for alignment requirements and endianness. * Check for alignment requirements and endianness.
* *
* Called from a shell script check_align.sh to execute one test at a time. * Called from a shell script check_align.sh to execute one test at a time.
* Prohibited unaligned accesses cause a SIGBUS. * Prohibited unaligned accesses usually cause a SIGBUS.
*/ */
#include <stdio.h> #include <stdio.h>

2
util/check_code_policy.py

@ -1,6 +1,6 @@
#!/usr/bin/env python2 #!/usr/bin/env python2
# #
# Check various C source code policy rules and issue warnings for offenders # Check various source code policy rules and issue warnings for offenders.
# #
# Usage: # Usage:
# #

2
util/example_rombuild.sh

@ -10,7 +10,7 @@ make clean dist
# Prepare-and-config sources manually to enable ROM support. User builtin # Prepare-and-config sources manually to enable ROM support. User builtin
# metadata can be provided through one or more YAML files (which are applied # metadata can be provided through one or more YAML files (which are applied
# in sequence). # in sequence). Duktape configuration can be given at the same time.
rm -rf dist/src dist/src-noline dist/src-separate rm -rf dist/src dist/src-noline dist/src-separate
$PYTHON dist/tools/prepare_sources.py \ $PYTHON dist/tools/prepare_sources.py \
--source-directory dist/src-input \ --source-directory dist/src-input \

31
util/example_user_builtins1.yaml

@ -8,7 +8,7 @@
# #
# See examples below for details on how to use these. # See examples below for details on how to use these.
# #
# Note that genbuiltins.py (and prepare_sources.py) accepts multiple user # Note that genbuiltins.py and prepare_sources.py accept multiple user
# built-in YAML files, so that you can manage your custom strings and # built-in YAML files, so that you can manage your custom strings and
# objects in individual YAML files for modularity. # objects in individual YAML files for modularity.
# #
@ -48,6 +48,13 @@ add_forced_strings:
- str: "9" - str: "9"
# Non-ascii strings: encode into UTF-8, map bytes to U+0000...U+00FF. # Non-ascii strings: encode into UTF-8, map bytes to U+0000...U+00FF.
# This encoding approach ensures byte purity which is important to allow
# representing arbitrary byte strings in Duktape internal representation.
#
# For example, to encode 'ä' (U+00E4) first UTF-8 encode the codepoint
# which yields the bytes C3 A4, and then Unicode escape the bytes to get
# '\u00c3\u00a4'. The direct Unicode escape '\u00e4' *will not work*.
# Another example:
# #
# >>> u'säätö'.encode('utf-8').encode('hex') # >>> u'säätö'.encode('utf-8').encode('hex')
# '73c3a4c3a474c3b6' # '73c3a4c3a474c3b6'
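A small illustrative helper (not part of the commit) for producing the byte-pure escape form described above: UTF-8 encode the string, then escape each byte as \u00XX.

```python
# Illustrative only: maps the UTF-8 bytes of a string to U+0000...U+00FF escapes
# so arbitrary byte strings survive in Duktape's internal representation.
def escape_for_user_builtins(s):
    return ''.join('\\u%04x' % b for b in bytearray(s.encode('utf-8')))

print(escape_for_user_builtins(u'\u00e4'))                # 'ä' -> \u00c3\u00a4
print(escape_for_user_builtins(u's\u00e4\u00e4t\u00f6'))  # 'säätö', bytes 73 c3 a4 c3 a4 74 c3 b6
```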
@ -105,7 +112,8 @@ objects:
# Example of adding an object. Properties may reference other objects using # Example of adding an object. Properties may reference other objects using
# the 'id' strings even if the object hasn't been introduced yet (and may be # the 'id' strings even if the object hasn't been introduced yet (and may be
# introduced by a different user builtins YAML file). # introduced by a different user builtins YAML file). The ID references are
# in effect resolved only after all metadata has been loaded.
# #
# Add a built-in object with plain property values. This just creates the # Add a built-in object with plain property values. This just creates the
# built-in object; for the object to be actually useful you must use e.g. # built-in object; for the object to be actually useful you must use e.g.
@ -147,7 +155,8 @@ objects:
# #
# - The "null" type is supported for ROM init data but not for RAM # - The "null" type is supported for ROM init data but not for RAM
# init data. Null values are replaced with "undefined" for RAM # init data. Null values are replaced with "undefined" for RAM
# init data at the moment (with a build warning). # init data at the moment (with a build warning). Same applies
# to lightfunc values at present.
# #
# - One simple way of coming up with exact IEEE doubles is to use # - One simple way of coming up with exact IEEE doubles is to use
# Python: # Python:
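The Python approach referenced above might look like the following sketch (illustrative, not part of the commit):

```python
# Illustrative only: produce the 8 big-endian IEEE 754 bytes of a double,
# suitable for specifying exact 'ieeeDoubleType'-style values.
import binascii
import struct

def ieee_double_hex(x):
    return binascii.hexlify(struct.pack('>d', x))

print(ieee_double_hex(1.0))  # 3ff0000000000000
print(ieee_double_hex(0.5))  # 3fe0000000000000
```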
@ -172,10 +181,10 @@ objects:
value: true value: true
attributes: wec attributes: wec
- key: "integerType" - key: "integerType"
value: 123 value: 123 # Interpreted as an IEEE double
attributes: wec attributes: wec
- key: "floatType" - key: "floatType"
value: 123.4 value: 123.4 # Interpreted as an IEEE double
attributes: wec attributes: wec
- key: "ieeeDoubleType" # Allows exact specification of IEEE doubles - key: "ieeeDoubleType" # Allows exact specification of IEEE doubles
value: value:
@ -193,13 +202,13 @@ objects:
# native: duk_bi_math_object_random # native: duk_bi_math_object_random
# nargs: 0 # nargs: 0
# varargs: false # varargs: false
# length: 13 # length: 11
# magic: 0 # magic: 0
# attributes: wec # attributes: wec
# Object IDs are only resolved when metadata loading is complete, so it's # Object IDs are only resolved when metadata loading is complete, so it's
# OK to create reference loops or refer to objects defined later (even in # OK to create reference loops or refer to objects defined later (even in
# a separate YAML file not yet loaded). # a separate YAML file not yet loaded).
- id: bi_circular1 - id: bi_circular1
add: true add: true
@ -279,7 +288,7 @@ objects:
value: "" value: ""
- id: bi_json - id: bi_json
disable: true # disabled in metadata disable: true # disabled in metadata, so no effect unless commented out
replace: true replace: true
class: Object class: Object
@ -300,8 +309,8 @@ objects:
# custom built-ins and a target specific YAML file removes bindings not # custom built-ins and a target specific YAML file removes bindings not
# needed for a certain target. # needed for a certain target.
# #
# In this example we'd delete the StarTrek object. The global reference # In this example we'd delete the StarTrek object. The dangling global
# global.StarTrek would be deleted automatically. # reference global.StarTrek would be deleted automatically.
- id: bi_star_trek - id: bi_star_trek
disable: true # disabled in metadata disable: true # disabled in metadata

2
util/fastint_reps.py

@ -11,7 +11,7 @@ def isFastint(x):
if math.floor(x) == x and \ if math.floor(x) == x and \
x >= -(2**47) and \ x >= -(2**47) and \
x < (2**47) and \ x < (2**47) and \
(x != 0 or math.copysign(1.0, x) == 1.0): (x != 0 or math.copysign(1.0, x) == 1.0): # Negative zero is a bit tricky
return True return True
return False return False
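As a standalone illustration of the check above (a sketch, not part of the commit), including the negative zero case the new comment calls out:

```python
# Sketch: same fastint predicate as in util/fastint_reps.py, with example inputs.
import math

def is_fastint(x):
    return (math.floor(x) == x and
            -(2 ** 47) <= x < (2 ** 47) and
            (x != 0 or math.copysign(1.0, x) == 1.0))

print(is_fastint(123.0))      # True
print(is_fastint(-0.0))       # False: negative zero is not a fastint
print(is_fastint(2.0 ** 47))  # False: outside the signed 48-bit range
```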

1
util/format_perftest.py

@ -1,6 +1,7 @@
#!/usr/bin/env python2 #!/usr/bin/env python2
# #
# Format a perftest text dump into a HTML table. # Format a perftest text dump into a HTML table.
#
import os import os
import sys import sys

7
util/gendoubleconsts.py

@ -1,8 +1,9 @@
#!/usr/bin/env python2 #!/usr/bin/env python2
#
# Double constants, see http://en.wikipedia.org/wiki/Double-precision_floating-point_format. # Double constants, see http://en.wikipedia.org/wiki/Double-precision_floating-point_format.
# Some double constants have been created with 'python-mpmath'. The constants are in binary # YAML builtins metadata expresses the constants in binary form (8 bytes of
# so that the package is not needed for a normal build. # IEEE double data) to ensure bit exactness.
#
import struct import struct
import mpmath import mpmath

1
util/genhashsizes.py

@ -8,6 +8,7 @@
# #
# Also generates a set of probe steps which are relatively prime to every # Also generates a set of probe steps which are relatively prime to every
# hash size. # hash size.
#
import sys import sys
import math import math

1
util/gennumdigits.py

@ -17,6 +17,7 @@
# values bounded. The exponent limit is relative to an integer # values bounded. The exponent limit is relative to an integer
# significand padded to the precision-related digit count (e.g. # significand padded to the precision-related digit count (e.g.
# 20 for decimal). # 20 for decimal).
#
import math import math

492
util/make_dist.py

@ -3,21 +3,6 @@
# Create a distributable Duktape package into 'dist' directory. The contents # Create a distributable Duktape package into 'dist' directory. The contents
# of this directory can then be packaged into a source distributable. # of this directory can then be packaged into a source distributable.
# #
# The distributed source files contain all profiles and variants in one.
# A developer should be able to use the distributed source as follows:
#
# 1. Add the Duktape source files to their project, whichever build
# tool they use (make, scons, etc)
#
# 2. Add the Duktape header files to their include path.
#
# 3. Optionally define some DUK_OPT_xxx feature options.
#
# 4. Compile their program (which uses Duktape API).
#
# In addition to sources, documentation, example programs, and some
# example Makefiles are packaged into the dist package.
#
import os import os
import sys import sys
@ -28,7 +13,7 @@ import optparse
import subprocess import subprocess
import tarfile import tarfile
# Helpers # Helpers.
def exec_get_stdout(cmd, input=None, default=None, print_stdout=False): def exec_get_stdout(cmd, input=None, default=None, print_stdout=False):
try: try:
@ -128,8 +113,9 @@ def glob_files(pattern):
def cstring(x): def cstring(x):
return '"' + x + '"' # good enough for now return '"' + x + '"' # good enough for now
# DUK_VERSION is grepped from duk_api_public.h.in: it is needed for the # Get Duktape version number as an integer. DUK_VERSION is grepped from
# public API and we want to avoid defining it in two places. # duk_api_public.h.in: it is needed for the public API and we want to avoid
# defining it in multiple places.
def get_duk_version(): def get_duk_version():
r = re.compile(r'^#define\s+DUK_VERSION\s+(.*?)L?\s*$') r = re.compile(r'^#define\s+DUK_VERSION\s+(.*?)L?\s*$')
with open(os.path.join('src', 'duk_api_public.h.in'), 'rb') as f: with open(os.path.join('src', 'duk_api_public.h.in'), 'rb') as f:
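For context, a sketch (not part of the commit) of how the packed integer relates to the formatted version and the soname used later; the MAJOR*10000 + MINOR*100 + PATCH packing is an assumption consistent with the "10500 -> 105" comment below.

```python
# Sketch: decode an assumed MAJOR*10000 + MINOR*100 + PATCH packing of DUK_VERSION.
def decode_duk_version(duk_version):
    major = duk_version // 10000
    minor = (duk_version // 100) % 100
    patch = duk_version % 100
    return major, minor, patch

print(decode_duk_version(10500))  # (1, 5, 0)
print(10500 // 100)               # 105, the shared library soname version
```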
@ -187,30 +173,35 @@ def create_dist_directories(dist):
mkdir(os.path.join(dist, 'examples', 'dummy-date-provider')) mkdir(os.path.join(dist, 'examples', 'dummy-date-provider'))
mkdir(os.path.join(dist, 'examples', 'cpp-exceptions')) mkdir(os.path.join(dist, 'examples', 'cpp-exceptions'))
# Path check (spot check a few files to ensure we're in Duktape repo root) # Spot check a few files to ensure we're in Duktape repo root, as dist only
# works from there.
if not (os.path.isfile(os.path.join('src', 'duk_api_public.h.in')) and \ def check_cwd_duktape_repo_root():
if not (os.path.isfile(os.path.join('src', 'duk_api_public.h.in')) and \
os.path.isfile(os.path.join('config', 'platforms.yaml'))): os.path.isfile(os.path.join('config', 'platforms.yaml'))):
sys.stderr.write('\n') sys.stderr.write('\n')
sys.stderr.write('*** Working directory must be Duktape repo checkout root!\n') sys.stderr.write('*** Working directory must be Duktape repo checkout root!\n')
sys.stderr.write('\n') sys.stderr.write('\n')
raise Exception('Incorrect working directory') raise Exception('Incorrect working directory')
# Option parsing # Option parsing.
def parse_options():
parser = optparse.OptionParser()
parser.add_option('--create-spdx', dest='create_spdx', action='store_true', default=False, help='Create SPDX license file')
parser.add_option('--git-commit', dest='git_commit', default=None, help='Force git commit hash')
parser.add_option('--git-describe', dest='git_describe', default=None, help='Force git describe')
parser.add_option('--git-branch', dest='git_branch', default=None, help='Force git branch name')
parser.add_option('--rom-support', dest='rom_support', action='store_true', help='Deprecated, use prepare_sources.py instead')
parser.add_option('--rom-auto-lightfunc', dest='rom_auto_lightfunc', action='store_true', default=False, help='Deprecated, use prepare_sources.py instead')
parser.add_option('--user-builtin-metadata', dest='user_builtin_metadata', action='append', default=[], help='Deprecated, use prepare_sources.py instead')
(opts, args) = parser.parse_args()
parser = optparse.OptionParser() return opts, args
parser.add_option('--create-spdx', dest='create_spdx', action='store_true', default=False, help='Create SPDX license file')
parser.add_option('--git-commit', dest='git_commit', default=None, help='Force git commit hash')
parser.add_option('--git-describe', dest='git_describe', default=None, help='Force git describe')
parser.add_option('--git-branch', dest='git_branch', default=None, help='Force git branch name')
parser.add_option('--rom-support', dest='rom_support', action='store_true', help='Add support for ROM strings/objects (increases duktape.c size considerably)')
parser.add_option('--rom-auto-lightfunc', dest='rom_auto_lightfunc', action='store_true', default=False, help='Convert ROM built-in function properties into lightfuncs automatically whenever possible')
parser.add_option('--user-builtin-metadata', dest='user_builtin_metadata', action='append', default=[], help='User strings and objects to add, YAML format (can be repeated for multiple overrides)')
(opts, args) = parser.parse_args()
# Python module check and friendly errors # Python module check and friendly errors.
def check_python_modules(): def check_python_modules(opts):
# make_dist.py doesn't need yaml but other dist utils will; check for it and # make_dist.py doesn't need yaml but other dist utils will; check for it and
# warn if it is missing. # warn if it is missing.
failed = False failed = False
@ -243,50 +234,185 @@ def check_python_modules():
sys.stderr.write('\n') sys.stderr.write('\n')
raise Exception('Missing some required Python modules') raise Exception('Missing some required Python modules')
check_python_modules() def main():
# Basic option parsing, Python module check, CWD check.
opts, args = parse_options()
check_python_modules(opts)
check_cwd_duktape_repo_root()
# Obsolete options check.
if opts.rom_support or opts.rom_auto_lightfunc or len(opts.user_builtin_metadata) > 0:
raise Exception('obsolete ROM support argument(s), use tools/prepare_sources.py instead')
# Figure out directories, git info, etc # Figure out directories, git info, Duktape version, etc.
entry_pwd = os.getcwd() entry_pwd = os.getcwd()
dist = os.path.join(entry_pwd, 'dist') dist = os.path.join(entry_pwd, 'dist')
duk_version, duk_major, duk_minor, duk_patch, duk_version_formatted = get_duk_version() duk_version, duk_major, duk_minor, duk_patch, duk_version_formatted = get_duk_version()
if opts.git_commit is not None: if opts.git_commit is not None:
git_commit = opts.git_commit git_commit = opts.git_commit
else: else:
git_commit = exec_get_stdout([ 'git', 'rev-parse', 'HEAD' ], default='external').strip() git_commit = exec_get_stdout([ 'git', 'rev-parse', 'HEAD' ], default='external').strip()
if opts.git_describe is not None: if opts.git_describe is not None:
git_describe = opts.git_describe git_describe = opts.git_describe
else: else:
git_describe = exec_get_stdout([ 'git', 'describe', '--always', '--dirty' ], default='external').strip() git_describe = exec_get_stdout([ 'git', 'describe', '--always', '--dirty' ], default='external').strip()
if opts.git_branch is not None: if opts.git_branch is not None:
git_branch = opts.git_branch git_branch = opts.git_branch
else: else:
git_branch = exec_get_stdout([ 'git', 'rev-parse', '--abbrev-ref', 'HEAD' ], default='external').strip() git_branch = exec_get_stdout([ 'git', 'rev-parse', '--abbrev-ref', 'HEAD' ], default='external').strip()
git_commit_cstring = cstring(git_commit) git_commit_cstring = cstring(git_commit)
git_describe_cstring = cstring(git_describe) git_describe_cstring = cstring(git_describe)
git_branch_cstring = cstring(git_branch) git_branch_cstring = cstring(git_branch)
print('Dist for Duktape version %s, commit %s, describe %s, branch %s' % \ print('Dist for Duktape version %s, commit %s, describe %s, branch %s' % \
(duk_version_formatted, git_commit, git_describe, git_branch)) (duk_version_formatted, git_commit, git_describe, git_branch))
print('Create dist directories and copy static files') # Create dist directory structure, copy files.
# Create dist directory structure print('Create dist directories and copy static files')
create_dist_directories(dist) create_dist_directories(dist)
# Copy most files directly os.chdir(entry_pwd)
os.chdir(entry_pwd) copy_files([
'builtins.yaml',
for fn in glob.glob(os.path.join('src', '*')): 'duk_alloc_default.c',
copy_file(fn, os.path.join(dist, 'src-input', os.path.basename(fn))) 'duk_api_buffer.c',
'duk_api_bytecode.c',
os.chdir(os.path.join(entry_pwd, 'config')) 'duk_api_call.c',
create_targz(os.path.join(dist, 'config', 'genconfig_metadata.tar.gz'), [ 'duk_api_codec.c',
'duk_api_compile.c',
'duk_api_debug.c',
'duk_api_heap.c',
'duk_api_internal.h',
'duk_api_memory.c',
'duk_api_object.c',
'duk_api_public.h.in',
'duk_api_stack.c',
'duk_api_string.c',
'duk_api_time.c',
'duk_bi_array.c',
'duk_bi_boolean.c',
'duk_bi_buffer.c',
'duk_bi_date.c',
'duk_bi_date_unix.c',
'duk_bi_date_windows.c',
'duk_bi_duktape.c',
'duk_bi_error.c',
'duk_bi_function.c',
'duk_bi_global.c',
'duk_bi_json.c',
'duk_bi_math.c',
'duk_bi_number.c',
'duk_bi_object.c',
'duk_bi_pointer.c',
'duk_bi_protos.h',
'duk_bi_proxy.c',
'duk_bi_regexp.c',
'duk_bi_string.c',
'duk_bi_thread.c',
'duk_bi_thrower.c',
'duk_dblunion.h.in',
'duk_debug_fixedbuffer.c',
'duk_debugger.c',
'duk_debugger.h',
'duk_debug.h',
'duk_debug_macros.c',
'duk_debug_vsnprintf.c',
'duk_error_augment.c',
'duk_error.h',
'duk_error_longjmp.c',
'duk_error_macros.c',
'duk_error_misc.c',
'duk_error_throw.c',
'duk_exception.h',
'duk_forwdecl.h',
'duk_harray.h',
'duk_hbuffer_alloc.c',
'duk_hbuffer.h',
'duk_hbuffer_ops.c',
'duk_hbufobj.h',
'duk_hbufobj_misc.c',
'duk_hcompfunc.h',
'duk_heap_alloc.c',
'duk_heap.h',
'duk_heap_hashstring.c',
'duk_heaphdr.h',
'duk_heap_markandsweep.c',
'duk_heap_memory.c',
'duk_heap_misc.c',
'duk_heap_refcount.c',
'duk_heap_stringcache.c',
'duk_heap_stringtable.c',
'duk_hnatfunc.h',
'duk_hobject_alloc.c',
'duk_hobject_class.c',
'duk_hobject_enum.c',
'duk_hobject_finalizer.c',
'duk_hobject.h',
'duk_hobject_misc.c',
'duk_hobject_pc2line.c',
'duk_hobject_props.c',
'duk_hstring.h',
'duk_hstring_misc.c',
'duk_hthread_alloc.c',
'duk_hthread_builtins.c',
'duk_hthread.h',
'duk_hthread_misc.c',
'duk_hthread_stacks.c',
'duk_internal.h',
'duk_jmpbuf.h',
'duk_js_bytecode.h',
'duk_js_call.c',
'duk_js_compiler.c',
'duk_js_compiler.h',
'duk_js_executor.c',
'duk_js.h',
'duk_json.h',
'duk_js_ops.c',
'duk_js_var.c',
'duk_lexer.c',
'duk_lexer.h',
'duk_numconv.c',
'duk_numconv.h',
'duk_regexp_compiler.c',
'duk_regexp_executor.c',
'duk_regexp.h',
'duk_replacements.c',
'duk_replacements.h',
'duk_selftest.c',
'duk_selftest.h',
'duk_strings.h',
'duktape.h.in',
'duk_tval.c',
'duk_tval.h',
'duk_unicode.h',
'duk_unicode_support.c',
'duk_unicode_tables.c',
'duk_util_bitdecoder.c',
'duk_util_bitencoder.c',
'duk_util_bufwriter.c',
'duk_util.h',
'duk_util_hashbytes.c',
'duk_util_hashprime.c',
'duk_util_misc.c',
'duk_util_tinyrandom.c',
'strings.yaml',
'SpecialCasing.txt',
'SpecialCasing-8bit.txt',
'UnicodeData.txt',
'UnicodeData-8bit.txt',
], 'src', os.path.join(dist, 'src-input'))
os.chdir(os.path.join(entry_pwd, 'config'))
create_targz(os.path.join(dist, 'config', 'genconfig_metadata.tar.gz'), [
'tags.yaml', 'tags.yaml',
'platforms.yaml', 'platforms.yaml',
'architectures.yaml', 'architectures.yaml',
@ -300,10 +426,10 @@ create_targz(os.path.join(dist, 'config', 'genconfig_metadata.tar.gz'), [
'header-snippets', 'header-snippets',
'other-defines', 'other-defines',
'examples' 'examples'
]) ])
os.chdir(entry_pwd) os.chdir(entry_pwd)
copy_files([ copy_files([
'prepare_sources.py', 'prepare_sources.py',
'combine_src.py', 'combine_src.py',
'create_spdx_license.py', 'create_spdx_license.py',
@ -313,7 +439,6 @@ copy_files([
'extract_caseconv.py', 'extract_caseconv.py',
'extract_chars.py', 'extract_chars.py',
'extract_unique_options.py', 'extract_unique_options.py',
'genbuildparams.py',
'genbuiltins.py', 'genbuiltins.py',
'genconfig.py', 'genconfig.py',
'json2yaml.py', 'json2yaml.py',
@ -323,16 +448,13 @@ copy_files([
'scan_strings.py', 'scan_strings.py',
'scan_used_stridx_bidx.py', 'scan_used_stridx_bidx.py',
'yaml2json.py', 'yaml2json.py',
], 'tools', os.path.join(dist, 'tools')) ], 'tools', os.path.join(dist, 'tools'))
# XXX: Copy genconfig.py also to config/genconfig.py for now. copy_files([
copy_file(os.path.join(dist, 'tools', 'genconfig.py'), os.path.join(dist, 'config', 'genconfig.py'))
copy_files([
'README.rst' 'README.rst'
], 'config', os.path.join(dist, 'config')) ], 'config', os.path.join(dist, 'config'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'Makefile', 'Makefile',
'package.json', 'package.json',
@ -342,14 +464,15 @@ copy_files([
'duk_debugcommands.yaml', 'duk_debugcommands.yaml',
'duk_debugerrors.yaml', 'duk_debugerrors.yaml',
'duk_opcodes.yaml' 'duk_opcodes.yaml'
], 'debugger', os.path.join(dist, 'debugger')) ], 'debugger', os.path.join(dist, 'debugger'))
copy_files([
copy_files([
'index.html', 'index.html',
'style.css', 'style.css',
'webui.js' 'webui.js'
], os.path.join('debugger', 'static'), os.path.join(dist, 'debugger', 'static')) ], os.path.join('debugger', 'static'), os.path.join(dist, 'debugger', 'static'))
copy_files([ copy_files([
'console-minimal.js', 'console-minimal.js',
'object-prototype-definegetter.js', 'object-prototype-definegetter.js',
'object-prototype-definesetter.js', 'object-prototype-definesetter.js',
@ -359,19 +482,19 @@ copy_files([
'duktape-error-setter-writable.js', 'duktape-error-setter-writable.js',
'duktape-error-setter-nonwritable.js', 'duktape-error-setter-nonwritable.js',
'duktape-buffer.js' 'duktape-buffer.js'
], 'polyfills', os.path.join(dist, 'polyfills')) ], 'polyfills', os.path.join(dist, 'polyfills'))
copy_files([ copy_files([
'README.rst' 'README.rst'
], 'examples', os.path.join(dist, 'examples')) ], 'examples', os.path.join(dist, 'examples'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_cmdline.c', 'duk_cmdline.c',
'duk_cmdline_ajduk.c' 'duk_cmdline_ajduk.c'
], os.path.join('examples', 'cmdline'), os.path.join(dist, 'examples', 'cmdline')) ], os.path.join('examples', 'cmdline'), os.path.join(dist, 'examples', 'cmdline'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'c_eventloop.c', 'c_eventloop.c',
'c_eventloop.js', 'c_eventloop.js',
@ -385,19 +508,19 @@ copy_files([
'basic-test.js', 'basic-test.js',
'server-socket-test.js', 'server-socket-test.js',
'client-socket-test.js' 'client-socket-test.js'
], os.path.join('examples', 'eventloop'), os.path.join(dist, 'examples', 'eventloop')) ], os.path.join('examples', 'eventloop'), os.path.join(dist, 'examples', 'eventloop'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'hello.c' 'hello.c'
], os.path.join('examples', 'hello'), os.path.join(dist, 'examples', 'hello')) ], os.path.join('examples', 'hello'), os.path.join(dist, 'examples', 'hello'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'eval.c' 'eval.c'
], os.path.join('examples', 'eval'), os.path.join(dist, 'examples', 'eval')) ], os.path.join('examples', 'eval'), os.path.join(dist, 'examples', 'eval'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'fib.js', 'fib.js',
'process.js', 'process.js',
@ -405,90 +528,89 @@ copy_files([
'prime.js', 'prime.js',
'primecheck.c', 'primecheck.c',
'uppercase.c' 'uppercase.c'
], os.path.join('examples', 'guide'), os.path.join(dist, 'examples', 'guide')) ], os.path.join('examples', 'guide'), os.path.join(dist, 'examples', 'guide'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'globals.coffee', 'globals.coffee',
'hello.coffee', 'hello.coffee',
'mandel.coffee' 'mandel.coffee'
], os.path.join('examples', 'coffee'), os.path.join(dist, 'examples', 'coffee')) ], os.path.join('examples', 'coffee'), os.path.join(dist, 'examples', 'coffee'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'jxpretty.c' 'jxpretty.c'
], os.path.join('examples', 'jxpretty'), os.path.join(dist, 'examples', 'jxpretty')) ], os.path.join('examples', 'jxpretty'), os.path.join(dist, 'examples', 'jxpretty'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'sandbox.c' 'sandbox.c'
], os.path.join('examples', 'sandbox'), os.path.join(dist, 'examples', 'sandbox')) ], os.path.join('examples', 'sandbox'), os.path.join(dist, 'examples', 'sandbox'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_alloc_logging.c', 'duk_alloc_logging.c',
'duk_alloc_logging.h', 'duk_alloc_logging.h',
'log2gnuplot.py' 'log2gnuplot.py'
], os.path.join('examples', 'alloc-logging'), os.path.join(dist, 'examples', 'alloc-logging')) ], os.path.join('examples', 'alloc-logging'), os.path.join(dist, 'examples', 'alloc-logging'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_alloc_torture.c', 'duk_alloc_torture.c',
'duk_alloc_torture.h' 'duk_alloc_torture.h'
], os.path.join('examples', 'alloc-torture'), os.path.join(dist, 'examples', 'alloc-torture')) ], os.path.join('examples', 'alloc-torture'), os.path.join(dist, 'examples', 'alloc-torture'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_alloc_hybrid.c', 'duk_alloc_hybrid.c',
'duk_alloc_hybrid.h' 'duk_alloc_hybrid.h'
], os.path.join('examples', 'alloc-hybrid'), os.path.join(dist, 'examples', 'alloc-hybrid')) ], os.path.join('examples', 'alloc-hybrid'), os.path.join(dist, 'examples', 'alloc-hybrid'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_trans_socket_unix.c', 'duk_trans_socket_unix.c',
'duk_trans_socket_windows.c', 'duk_trans_socket_windows.c',
'duk_trans_socket.h' 'duk_trans_socket.h'
], os.path.join('examples', 'debug-trans-socket'), os.path.join(dist, 'examples', 'debug-trans-socket')) ], os.path.join('examples', 'debug-trans-socket'), os.path.join(dist, 'examples', 'debug-trans-socket'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_trans_dvalue.c', 'duk_trans_dvalue.c',
'duk_trans_dvalue.h', 'duk_trans_dvalue.h',
'test.c', 'test.c',
'Makefile' 'Makefile'
], os.path.join('examples', 'debug-trans-dvalue'), os.path.join(dist, 'examples', 'debug-trans-dvalue')) ], os.path.join('examples', 'debug-trans-dvalue'), os.path.join(dist, 'examples', 'debug-trans-dvalue'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_codepage_conv.c', 'duk_codepage_conv.c',
'duk_codepage_conv.h', 'duk_codepage_conv.h',
'test.c' 'test.c'
], os.path.join('examples', 'codepage-conv'), os.path.join(dist, 'examples', 'codepage-conv')) ], os.path.join('examples', 'codepage-conv'), os.path.join(dist, 'examples', 'codepage-conv'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'dummy_date_provider.c' 'dummy_date_provider.c'
], os.path.join('examples', 'dummy-date-provider'), os.path.join(dist, 'examples', 'dummy-date-provider')) ], os.path.join('examples', 'dummy-date-provider'), os.path.join(dist, 'examples', 'dummy-date-provider'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'cpp_exceptions.cpp' 'cpp_exceptions.cpp'
], os.path.join('examples', 'cpp-exceptions'), os.path.join(dist, 'examples', 'cpp-exceptions')) ], os.path.join('examples', 'cpp-exceptions'), os.path.join(dist, 'examples', 'cpp-exceptions'))
copy_files([ copy_files([
'README.rst' 'README.rst'
], 'extras', os.path.join(dist, 'extras')) ], 'extras', os.path.join(dist, 'extras'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_logging.c', 'duk_logging.c',
'duk_logging.h', 'duk_logging.h',
'test.c', 'test.c',
'Makefile' 'Makefile'
], os.path.join('extras', 'logging'), os.path.join(dist, 'extras', 'logging')) ], os.path.join('extras', 'logging'), os.path.join(dist, 'extras', 'logging'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_v1_compat.c', 'duk_v1_compat.c',
'duk_v1_compat.h', 'duk_v1_compat.h',
@ -498,49 +620,49 @@ copy_files([
'test_eval2.js', 'test_eval2.js',
'test_compile1.js', 'test_compile1.js',
'test_compile2.js' 'test_compile2.js'
], os.path.join('extras', 'duk-v1-compat'), os.path.join(dist, 'extras', 'duk-v1-compat')) ], os.path.join('extras', 'duk-v1-compat'), os.path.join(dist, 'extras', 'duk-v1-compat'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_print_alert.c', 'duk_print_alert.c',
'duk_print_alert.h', 'duk_print_alert.h',
'test.c', 'test.c',
'Makefile' 'Makefile'
], os.path.join('extras', 'print-alert'), os.path.join(dist, 'extras', 'print-alert')) ], os.path.join('extras', 'print-alert'), os.path.join(dist, 'extras', 'print-alert'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_console.c', 'duk_console.c',
'duk_console.h', 'duk_console.h',
'test.c', 'test.c',
'Makefile' 'Makefile'
], os.path.join('extras', 'console'), os.path.join(dist, 'extras', 'console')) ], os.path.join('extras', 'console'), os.path.join(dist, 'extras', 'console'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_minimal_printf.c', 'duk_minimal_printf.c',
'duk_minimal_printf.h', 'duk_minimal_printf.h',
'Makefile', 'Makefile',
'test.c' 'test.c'
], os.path.join('extras', 'minimal-printf'), os.path.join(dist, 'extras', 'minimal-printf')) ], os.path.join('extras', 'minimal-printf'), os.path.join(dist, 'extras', 'minimal-printf'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_module_duktape.c', 'duk_module_duktape.c',
'duk_module_duktape.h', 'duk_module_duktape.h',
'Makefile', 'Makefile',
'test.c' 'test.c'
], os.path.join('extras', 'module-duktape'), os.path.join(dist, 'extras', 'module-duktape')) ], os.path.join('extras', 'module-duktape'), os.path.join(dist, 'extras', 'module-duktape'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_module_node.c', 'duk_module_node.c',
'duk_module_node.h', 'duk_module_node.h',
'Makefile', 'Makefile',
'test.c' 'test.c'
], os.path.join('extras', 'module-node'), os.path.join(dist, 'extras', 'module-node')) ], os.path.join('extras', 'module-node'), os.path.join(dist, 'extras', 'module-node'))
copy_files([ copy_files([
'README.rst', 'README.rst',
'duk_alloc_pool.c', 'duk_alloc_pool.c',
'duk_alloc_pool.h', 'duk_alloc_pool.h',
@ -548,9 +670,9 @@ copy_files([
'ptrcomp_fixup.h', 'ptrcomp_fixup.h',
'Makefile', 'Makefile',
'test.c' 'test.c'
], os.path.join('extras', 'alloc-pool'), os.path.join(dist, 'extras', 'alloc-pool')) ], os.path.join('extras', 'alloc-pool'), os.path.join(dist, 'extras', 'alloc-pool'))
copy_files([ copy_files([
'Makefile.cmdline', 'Makefile.cmdline',
'Makefile.dukdebug', 'Makefile.dukdebug',
'Makefile.eventloop', 'Makefile.eventloop',
@ -561,55 +683,59 @@ copy_files([
'Makefile.sandbox', 'Makefile.sandbox',
'Makefile.codepage', 'Makefile.codepage',
'mandel.js' 'mandel.js'
], 'dist-files', dist) ], 'dist-files', dist)
copy_and_replace(os.path.join('dist-files', 'Makefile.sharedlibrary'), os.path.join(dist, 'Makefile.sharedlibrary'), {
copy_and_replace(os.path.join('dist-files', 'Makefile.sharedlibrary'), os.path.join(dist, 'Makefile.sharedlibrary'), {
'@DUK_VERSION@': str(duk_version), '@DUK_VERSION@': str(duk_version),
'@SONAME_VERSION@': str(int(duk_version / 100)) # 10500 -> 105 '@SONAME_VERSION@': str(int(duk_version / 100)) # 10500 -> 105
}) })
copy_and_replace(os.path.join('dist-files', 'README.rst'), os.path.join(dist, 'README.rst'), { copy_and_replace(os.path.join('dist-files', 'README.rst'), os.path.join(dist, 'README.rst'), {
'@DUK_VERSION_FORMATTED@': duk_version_formatted, '@DUK_VERSION_FORMATTED@': duk_version_formatted,
'@GIT_COMMIT@': git_commit, '@GIT_COMMIT@': git_commit,
'@GIT_DESCRIBE@': git_describe, '@GIT_DESCRIBE@': git_describe,
'@GIT_BRANCH@': git_branch '@GIT_BRANCH@': git_branch
}) })
copy_files([ copy_files([
'LICENSE.txt', # not strict RST so keep .txt suffix 'LICENSE.txt', # not strict RST so keep .txt suffix
'AUTHORS.rst' 'AUTHORS.rst'
], '.', os.path.join(dist)) ], '.', os.path.join(dist))
# RELEASES.rst is only updated in master. It's not included in the dist to # RELEASES.rst is only updated in master. It's not included in the dist to
# make maintenance fixes easier to make. # make maintenance fixes easier to make.
copy_files([ copy_files([
'murmurhash2.txt', 'murmurhash2.txt',
'lua.txt', 'lua.txt',
'commonjs.txt' 'commonjs.txt'
], 'licenses', os.path.join(dist, 'licenses')) ], 'licenses', os.path.join(dist, 'licenses'))
# Merge debugger metadata.
# Merge debugger metadata. merged = exec_print_stdout([
merged = exec_print_stdout([
sys.executable, os.path.join('tools', 'merge_debug_meta.py'), sys.executable, os.path.join('tools', 'merge_debug_meta.py'),
'--output', os.path.join(dist, 'debugger', 'duk_debug_meta.json'), '--output', os.path.join(dist, 'debugger', 'duk_debug_meta.json'),
'--class-names', os.path.join('debugger', 'duk_classnames.yaml'), '--class-names', os.path.join('debugger', 'duk_classnames.yaml'),
'--debug-commands', os.path.join('debugger', 'duk_debugcommands.yaml'), '--debug-commands', os.path.join('debugger', 'duk_debugcommands.yaml'),
'--debug-errors', os.path.join('debugger', 'duk_debugerrors.yaml'), '--debug-errors', os.path.join('debugger', 'duk_debugerrors.yaml'),
'--opcodes', os.path.join('debugger', 'duk_opcodes.yaml') '--opcodes', os.path.join('debugger', 'duk_opcodes.yaml')
]) ])
# Build some example duk_config.h headers. This may go away later.
print('Create duk_config.h headers') print('Create duk_config.h headers')
# Build duk_config.h without feature option support. exec_print_stdout([
exec_print_stdout([
sys.executable, os.path.join('tools', 'genconfig.py'), '--metadata', 'config', sys.executable, os.path.join('tools', 'genconfig.py'), '--metadata', 'config',
'--output', os.path.join(dist, 'config', 'duk_config.h-modular-static'), '--output', os.path.join(dist, 'config', 'duk_config.h-modular-static'),
'--git-commit', git_commit, '--git-describe', git_describe, '--git-branch', git_branch, '--git-commit', git_commit, '--git-describe', git_describe, '--git-branch', git_branch,
'--omit-removed-config-options', '--omit-unused-config-options', '--omit-removed-config-options', '--omit-unused-config-options',
'--emit-legacy-feature-check', '--emit-config-sanity-check', '--emit-legacy-feature-check', '--emit-config-sanity-check',
'duk-config-header' 'duk-config-header'
]) ])
exec_print_stdout([
exec_print_stdout([
sys.executable, os.path.join('tools', 'genconfig.py'), '--metadata', 'config', sys.executable, os.path.join('tools', 'genconfig.py'), '--metadata', 'config',
'--output', os.path.join(dist, 'config', 'duk_config.h-modular-dll'), '--output', os.path.join(dist, 'config', 'duk_config.h-modular-dll'),
'--git-commit', git_commit, '--git-describe', git_describe, '--git-branch', git_branch, '--git-commit', git_commit, '--git-describe', git_describe, '--git-branch', git_branch,
@ -617,10 +743,9 @@ exec_print_stdout([
'--emit-legacy-feature-check', '--emit-config-sanity-check', '--emit-legacy-feature-check', '--emit-config-sanity-check',
'--dll', '--dll',
'duk-config-header' 'duk-config-header'
]) ])
# Generate a few barebones config examples def genconfig_barebones(platform, architecture, compiler):
def genconfig_barebones(platform, architecture, compiler):
exec_print_stdout([ exec_print_stdout([
sys.executable, os.path.join('tools', 'genconfig.py'), '--metadata', 'config', sys.executable, os.path.join('tools', 'genconfig.py'), '--metadata', 'config',
'--output', os.path.join(dist, 'config', 'duk_config.h-%s-%s-%s' % (platform, architecture, compiler)), '--output', os.path.join(dist, 'config', 'duk_config.h-%s-%s-%s' % (platform, architecture, compiler)),
@ -631,21 +756,23 @@ def genconfig_barebones(platform, architecture, compiler):
'duk-config-header' 'duk-config-header'
]) ])
#genconfig_barebones('linux', 'x86', 'gcc') #genconfig_barebones('linux', 'x86', 'gcc')
#genconfig_barebones('linux', 'x64', 'gcc') #genconfig_barebones('linux', 'x64', 'gcc')
#genconfig_barebones('linux', 'x86', 'clang') #genconfig_barebones('linux', 'x86', 'clang')
#genconfig_barebones('linux', 'x64', 'clang') #genconfig_barebones('linux', 'x64', 'clang')
#genconfig_barebones('windows', 'x86', 'msvc') #genconfig_barebones('windows', 'x86', 'msvc')
#genconfig_barebones('windows', 'x64', 'msvc') #genconfig_barebones('windows', 'x64', 'msvc')
#genconfig_barebones('apple', 'x86', 'gcc') #genconfig_barebones('apple', 'x86', 'gcc')
#genconfig_barebones('apple', 'x64', 'gcc') #genconfig_barebones('apple', 'x64', 'gcc')
#genconfig_barebones('apple', 'x86', 'clang') #genconfig_barebones('apple', 'x86', 'clang')
#genconfig_barebones('apple', 'x64', 'clang') #genconfig_barebones('apple', 'x64', 'clang')
# Build prepared sources (src/, src-noline/, src-separate/) with default # Build prepared sources (src/, src-noline/, src-separate/) with default
# config. This is done using tools and metadata in the dist directory. # config. This is done using tools and metadata in the dist directory.
print('Config-and-prepare sources for default configuration')
cmd = [ print('Config-and-prepare sources for default configuration')
cmd = [
sys.executable, os.path.join(dist, 'tools', 'prepare_sources.py'), sys.executable, os.path.join(dist, 'tools', 'prepare_sources.py'),
'--source-directory', os.path.join(dist, 'src-input'), '--source-directory', os.path.join(dist, 'src-input'),
'--output-directory', dist, '--output-directory', dist,
@ -653,21 +780,23 @@ cmd = [
'--git-commit', git_commit, '--git-describe', git_describe, '--git-branch', git_branch, '--git-commit', git_commit, '--git-describe', git_describe, '--git-branch', git_branch,
'--omit-removed-config-options', '--omit-unused-config-options', '--omit-removed-config-options', '--omit-unused-config-options',
'--emit-config-sanity-check', '--support-feature-options' '--emit-config-sanity-check', '--support-feature-options'
] ]
if opts.rom_support: if opts.rom_support:
cmd.append('--rom-support') cmd.append('--rom-support')
if opts.rom_auto_lightfunc: if opts.rom_auto_lightfunc:
cmd.append('--rom-auto-lightfunc') cmd.append('--rom-auto-lightfunc')
for i in opts.user_builtin_metadata: for i in opts.user_builtin_metadata:
cmd.append('--user-builtin-metadata') cmd.append('--user-builtin-metadata')
cmd.append(i) cmd.append(i)
exec_print_stdout(cmd) exec_print_stdout(cmd)
# Clean up remaining temp files # Clean up remaining temp files.
delete_matching_files(dist, lambda x: x[-4:] == '.tmp')
# Create SPDX license once all other files are in place (and cleaned) delete_matching_files(dist, lambda x: x[-4:] == '.tmp')
if opts.create_spdx:
# Create SPDX license once all other files are in place (and cleaned).
if opts.create_spdx:
print('Create SPDX license') print('Create SPDX license')
try: try:
exec_get_stdout([ exec_get_stdout([
@ -681,7 +810,10 @@ if opts.create_spdx:
print('*** WARNING: Failed to create SPDX license, this should not happen for an official release!') print('*** WARNING: Failed to create SPDX license, this should not happen for an official release!')
print('***') print('***')
print('') print('')
else: else:
print('Skip SPDX license creation') print('Skip SPDX license creation')
print('Dist finished successfully') print('Dist finished successfully')
if __name__ == '__main__':
main()

2
util/prep_test.py

@ -4,7 +4,7 @@
# #
# For Ecmascript testcases: # For Ecmascript testcases:
# #
# - Lift a 'use global' statement to top of file if present # - Lift a 'use strict' statement to top of file if present
# - Add a prologue which handles engine differences # - Add a prologue which handles engine differences
# - Resolve include files # - Resolve include files
# #

64
util/remove_bufferobject_properties.yaml

@ -0,0 +1,64 @@
objects:
# The bufferobject related built-ins have a DUK_BIDX_xxx constant so they
# must at present map to an object value (they may not be missing or map
# to e.g. null).
- id: bi_empty_bufferobject
class: Object
internal_prototype: bi_object_prototype
properties: []
- id: bi_global
modify: true
properties:
- key: 'DataView'
value:
type: object
id: bi_empty_bufferobject
- key: 'Uint8Array'
value:
type: object
id: bi_empty_bufferobject
- key: 'Uint8ClampedArray'
value:
type: object
id: bi_empty_bufferobject
- key: 'Int8Array'
value:
type: object
id: bi_empty_bufferobject
- key: 'Uint16Array'
value:
type: object
id: bi_empty_bufferobject
- key: 'Int16Array'
value:
type: object
id: bi_empty_bufferobject
- key: 'Uint32Array'
value:
type: object
id: bi_empty_bufferobject
- key: 'Int32Array'
value:
type: object
id: bi_empty_bufferobject
- key: 'Float32Array'
value:
type: object
id: bi_empty_bufferobject
- key: 'Float64Array'
value:
type: object
id: bi_empty_bufferobject
- key: 'ArrayBuffer'
value:
type: object
id: bi_empty_bufferobject
- key: 'Buffer'
value:
type: object
id: bi_empty_bufferobject

8
util/remove_math_properties.yaml

@ -0,0 +1,8 @@
objects:
- id: bi_math
replace: true
class: Object
internal_prototype: bi_object_prototype
properties: []
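These property-removal YAML files would presumably be applied as ordinary user builtins metadata (an assumption; the invocation shape follows example_rombuild.sh above and all paths are illustrative):

```python
# Assumption: removal YAMLs are passed via --user-builtin-metadata and applied in sequence.
import subprocess
import sys

subprocess.check_call([
    sys.executable, 'tools/prepare_sources.py',
    '--source-directory', 'src-input',
    '--output-directory', 'prepared',
    '--config-metadata', 'config/genconfig_metadata.tar.gz',
    '--user-builtin-metadata', 'util/remove_math_properties.yaml',
    '--user-builtin-metadata', 'util/remove_bufferobject_properties.yaml'
])
```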

10
util/underscore-test-shim.js

@ -188,3 +188,13 @@ $.browser = {
var jQuery = $; var jQuery = $;
// document.createElement() is used by some tests
document.createElement = function createElement(name) {
return { name: name }; // Just a dummy return value
};
// the Duktape require() function confuses underscore tests
delete require;
// ... and the 'window' binding is expected
var window = new Function('return this;')();

1
util/underscore_test.sh

@ -19,6 +19,7 @@ else
fi fi
cat util/underscore-test-shim.js \ cat util/underscore-test-shim.js \
underscore/test/vendor/qunit.js \
underscore/underscore.js \ underscore/underscore.js \
$TEST \ $TEST \
> /tmp/duk-underscore-test.js > /tmp/duk-underscore-test.js
