
Merge pull request #948 from svaarala/more-configure-cleanups

More cleanups for dist/configure tools
pull/950/head
Sami Vaarala, 8 years ago, committed by GitHub (commit ececd0bca6)
Changed files (14), with changed line counts:
1. .gitignore (1)
2. README.md (17)
3. config/config-options/DUK_USE_EXAMPLE.yaml (5)
4. config/config-options/DUK_USE_FATAL_HANDLER.yaml (1)
5. doc/release-notes-v2-0.rst (23)
6. testrunner/client-simple-node/run_commit_test.py (64)
7. tools/combine_src.py (33)
8. tools/configure.py (103)
9. tools/genbuiltins.py (189)
10. tools/genconfig.py (101)
11. tools/merge_debug_meta.py (20)
12. tools/prepare_unicode_data.py (2)
13. util/dist.py (108)
14. util/example_rombuild.sh (12)

.gitignore (1 changed line)

@ -2,6 +2,7 @@
*.swp
*.o
*.strip
*.pyc
/_*
/duk.*
/libduktape.*

README.md (17 changed lines)

@ -63,6 +63,23 @@ for the basics.
Automatically generated bleeding edge snapshots from master are available at
[duktape.org/snapshots](http://duktape.org/snapshots).
The distributable `src/` directory contains a `duk_config.h` configuration
header and amalgamated sources for Duktape default configuration. Use
`python tools/configure.py` to create header and sources for customized
configuration options, see http://wiki.duktape.org/Configuring.html. For
example, to enable fastint support (example for Linux):
$ tar xvfJ duktape-2.0.0.tar.xz
$ cd duktape-2.0.0
$ rm -rf src-custom
$ python tools/configure.py \
--source-directory src-input \
--output-directory src-custom \
--config-metadata config \
-DDUK_USE_FASTINT
# src-custom/ will now contain: duktape.c, duktape.h, duk_config.h.
You can also clone this repository, make modifications, and build a source
distributable on Linux, OSX, and Windows using `python util/dist.py`.

config/config-options/DUK_USE_EXAMPLE.yaml (5 changed lines)

@ -81,6 +81,11 @@ tags:
- lowmemory
- experimental
# When set to true, genconfig.py will warn if no forced value is provided.
# This should be used sparingly, for options which are really strongly
# recommended so that configure.py output should warn about it.
#warn_if_missing: true
# Description for option, no newlines. Line breaking for e.g. C header
# is automatic.
description: >
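
For reference, the consuming side of this new key appears in the genconfig.py hunk later in this diff; condensed here into a self-contained sketch (the shape of the metadata dicts is assumed from the YAML above):

    import logging

    logger = logging.getLogger('genconfig.py')

    def warn_missing_forced_options(use_defs_list, forced_opts):
        # use_defs_list: option metadata dicts loaded from config-options/*.yaml.
        # forced_opts: defines the user forced via -D or --option-file.
        for doc in use_defs_list:
            if doc.get('warn_if_missing', False) and doc['define'] not in forced_opts:
                logger.warning('Recommended config option ' + doc['define'] + ' not provided')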

config/config-options/DUK_USE_FATAL_HANDLER.yaml (1 changed line)

@ -8,6 +8,7 @@ introduced: 2.0.0
default: false
tags:
- portability
warn_if_missing: true
description: >
Provide a custom default fatal error handler to replace the built-in one
(which causes an intentional segfault and forever loops). The default
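
With ``warn_if_missing`` now set, a configure run that does not force this option logs a warning. A hedged invocation sketch that silences it (the handler name ``my_fatal_handler`` is illustrative, not part of this PR):

    import subprocess

    # Force DUK_USE_FATAL_HANDLER so the new warn_if_missing check stays quiet.
    # 'my_fatal_handler' is an illustrative C identifier, not from this commit.
    subprocess.check_call([
        'python', 'tools/configure.py',
        '--source-directory', 'src-input',
        '--output-directory', 'src-custom',  # must not exist beforehand
        '--config-metadata', 'config',
        '-DDUK_USE_FATAL_HANDLER=my_fatal_handler'
    ])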

doc/release-notes-v2-0.rst (23 changed lines)

@ -72,8 +72,11 @@ Tooling changes
There are some tooling changes in this release:
* The distributable now includes raw sources (``src/`` in Duktape main repo)
in ``src-input/`` and some tooling in ``tools/``.
* The distributable now includes raw sources in ``src-input/`` and some
tooling in ``tools/``. This allows Duktape sources to be modified and
re-amalgamated directly from the distributable. The distributable still
includes sources prepared using default configuration (``src/``,
``src-noline/``, and ``src-separate``) and some configuration examples.
* The tooling includes a new ``tools/configure.py`` tool which creates
a ``duk_config.h`` and matching prepared sources simultaneously. This
@ -84,14 +87,11 @@ There are some tooling changes in this release:
``dist.py`` and no longer supports ``--rom-support``,
``--rom-auto-lightfunc``, and ``--user-builtin-metadata`` options. Use
the ``tools/configure.py`` tool instead, which supports these options.
* The distributable still includes sources prepared using default configuration
(``src/``, ``src-noline/``, and ``src-separate``) and some configuration
examples.
However, ``--user-builtin-metadata`` has been renamed ``--builtin-file``.
* The ``config/genconfig.py`` has been relocated to ``tools/genconfig.py`` in
the distributable. It can still be used as a standalone tool, but over time
the intent is that configuration and sources are prepared in one atomic step.
the distributable. It can still be used as a standalone tool, but using
configure.py is recommended instead.
To upgrade:
@ -99,15 +99,16 @@ To upgrade:
distributable, no changes are needed.
* If you're using ``genconfig.py``, check the path; correct path is now
``tools/genconfig.py``.
``tools/genconfig.py``. Consider replacing genconfig.py with configure.py.
* If you're using ROM built-ins via ``make_dist.py``, change your build to
use ``tools/configure.py`` instead.
use ``tools/configure.py`` instead, and rename ``--user-builtin-metadata``
options to ``--builtin-file`` (see the sketch after these notes).
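
A hedged sketch of the migrated ROM build step (paths and the ``my_builtins.yaml`` filename are illustrative):

    import subprocess

    # Before this PR: make_dist.py --rom-support --user-builtin-metadata my_builtins.yaml
    # After: configure.py with the renamed option.
    subprocess.check_call([
        'python', 'tools/configure.py',
        '--source-directory', 'src-input',
        '--output-directory', 'src-custom',
        '--config-metadata', 'config',
        '--rom-support',
        '--rom-auto-lightfunc',
        '--builtin-file', 'my_builtins.yaml'
    ])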
Dist package file changes
-------------------------
* Configuration metadata is now in unpacked form on ``dist/config`` to match
* Configuration metadata is now in unpacked form in ``dist/config`` to match
the Duktape master repo and make config files more convenient to patch.
The ``dist/tools/genconfig.py`` tool no longer accepts a tar.gz metadata
argument.

testrunner/client-simple-node/run_commit_test.py (64 changed lines)

@ -268,14 +268,14 @@ def context_linux_x64_duk_gxx():
def context_helper_get_binary_size_diff(compfn):
cwd = os.getcwd()
execute([ 'git', 'clean', '-f' ])
execute([ 'git', 'reset', '--hard' ])
execute([ 'git', 'reset', '--quiet', '--hard' ])
compfn()
newsz = get_binary_size(os.path.join(cwd, 'duk'))
execute([ 'git', 'clean', '-f' ])
execute([ 'git', 'reset', '--hard' ])
execute([ 'git', 'checkout', 'master' ])
execute([ 'git', 'reset', '--quiet', '--hard' ])
execute([ 'git', 'checkout', '--quiet', 'master' ])
execute([ 'git', 'clean', '-f' ])
execute([ 'git', 'reset', '--hard' ])
execute([ 'git', 'reset', '--quiet', '--hard' ])
execute([ 'make', 'clean' ])
compfn()
oldsz = get_binary_size(os.path.join(cwd, 'duk'))
@ -308,24 +308,21 @@ def context_linux_x64_gcc_defsize_fltoetc():
def context_helper_minsize_fltoetc(archopt, strip):
cwd = os.getcwd()
def comp():
execute([ 'make', 'dist' ])
execute([ 'rm', '-rf', os.path.join(cwd, 'dist', 'src') ])
execute([ 'rm', '-rf', os.path.join(cwd, 'dist', 'src-noline') ])
execute([ 'rm', '-rf', os.path.join(cwd, 'dist', 'src-separate') ])
execute([ 'make', 'clean' ])
execute([ 'rm', '-rf', os.path.join(cwd, 'prep') ])
cmd = [
'python2', os.path.join(cwd, 'dist', 'tools', 'configure.py'),
'--source-directory', os.path.join(cwd, 'dist', 'src-input'),
'--output-directory', os.path.join(cwd, 'dist'),
'--config-metadata', os.path.join(cwd, 'dist', 'config'),
'python2', os.path.join(cwd, 'tools', 'configure.py'),
'--source-directory', os.path.join(cwd, 'src'),
'--output-directory', os.path.join(cwd, 'prep'),
'--config-metadata', os.path.join(cwd, 'config'),
'--option-file', os.path.join(cwd, 'config', 'examples', 'low_memory.yaml')
]
if strip:
cmd += [
'--option-file', os.path.join(cwd, 'config', 'examples', 'low_memory_strip.yaml'),
'--unicode-data', os.path.join(cwd, 'dist', 'src-input', 'UnicodeData-8bit.txt'),
'--special-casing', os.path.join(cwd, 'dist', 'src-input', 'SpecialCasing-8bit.txt')
'--unicode-data', os.path.join(cwd, 'src', 'UnicodeData-8bit.txt'),
'--special-casing', os.path.join(cwd, 'src', 'SpecialCasing-8bit.txt')
]
execute(cmd)
execute([
@ -333,10 +330,10 @@ def context_helper_minsize_fltoetc(archopt, strip):
'-Os', '-fomit-frame-pointer',
'-flto', '-fno-asynchronous-unwind-tables',
'-ffunction-sections', '-Wl,--gc-sections',
'-I' + os.path.join('dist', 'src'),
'-I' + os.path.join('dist', 'examples', 'cmdline'),
os.path.join(cwd, 'dist', 'src', 'duktape.c'),
os.path.join(cwd, 'dist', 'examples', 'cmdline', 'duk_cmdline.c'),
'-I' + os.path.join('prep'),
'-I' + os.path.join('examples', 'cmdline'),
os.path.join(cwd, 'prep', 'duktape.c'),
os.path.join(cwd, 'examples', 'cmdline', 'duk_cmdline.c'),
'-lm'
])
res = execute([
@ -650,17 +647,14 @@ def context_helper_hello_ram(archopt):
def test(genconfig_opts):
os.chdir(cwd)
execute([ 'make', 'clean', 'dist' ])
execute([ 'rm', '-rf', os.path.join(cwd, 'dist', 'src') ])
execute([ 'rm', '-rf', os.path.join(cwd, 'dist', 'src-noline') ])
execute([ 'rm', '-rf', os.path.join(cwd, 'dist', 'src-separate') ])
execute([ 'make', 'clean' ])
execute([ 'rm', '-rf', os.path.join(cwd, 'prep') ])
cmd = [
'python2', os.path.join(cwd, 'dist', 'tools', 'configure.py'),
'--source-directory', os.path.join(cwd, 'dist', 'src-input'),
'--output-directory', os.path.join(cwd, 'dist'),
'--config-metadata', os.path.join(cwd, 'dist', 'config'),
'python2', os.path.join(cwd, 'tools', 'configure.py'),
'--source-directory', os.path.join(cwd, 'src'),
'--output-directory', os.path.join(cwd, 'prep'),
'--config-metadata', os.path.join(cwd, 'config'),
'--rom-support'
] + genconfig_opts
print(repr(cmd))
@ -670,9 +664,9 @@ def context_helper_hello_ram(archopt):
'-Os', '-fomit-frame-pointer',
'-flto', '-fno-asynchronous-unwind-tables',
'-ffunction-sections', '-Wl,--gc-sections',
'-I' + os.path.join('dist', 'src'),
os.path.join(cwd, 'dist', 'src', 'duktape.c'),
os.path.join(cwd, 'dist', 'examples', 'hello', 'hello.c'),
'-I' + os.path.join('prep'),
os.path.join(cwd, 'prep', 'duktape.c'),
os.path.join(cwd, 'examples', 'hello', 'hello.c'),
'-lm'
])
execute([
@ -982,12 +976,12 @@ def main():
unpack_targz(os.path.join(repo_snapshot_dir, fn))
execute([ 'git', 'clean', '-f' ])
execute([ 'git', 'reset', '--hard' ])
execute([ 'git', 'checkout', 'master' ])
execute([ 'git', 'reset', '--quiet', '--hard' ])
execute([ 'git', 'checkout', '--quiet', 'master' ])
execute([ 'git', 'pull', '--rebase' ])
execute([ 'git', 'clean', '-f' ])
execute([ 'git', 'reset', '--hard' ])
execute([ 'git', 'checkout', commit_name ])
execute([ 'git', 'reset', '--quiet', '--hard' ])
execute([ 'git', 'checkout', '--quiet', commit_name ])
execute([ 'git', 'describe', '--always', '--dirty' ])
fn = context_handlers.get(context)

tools/combine_src.py (33 changed lines)

@ -44,11 +44,17 @@
# defines must actually be different between two or more source files.
#
import os
import logging
import sys
logging.basicConfig(level=logging.INFO, stream=sys.stdout, format='%(name)-21s %(levelname)-7s %(message)s')
logger = logging.getLogger('combine_src.py')
logger.setLevel(logging.INFO)
import os
import re
import json
import optparse
import logging
# Include path for finding include files which are amalgamated.
include_paths = []
@ -112,14 +118,14 @@ def addAutomaticUndefs(f):
for line in f.lines:
m = re_def.match(line.data)
if m is not None:
#print('DEFINED: %s' % repr(m.group(1)))
#logger.debug('DEFINED: %s' % repr(m.group(1)))
defined[m.group(1)] = True
m = re_undef.match(line.data)
if m is not None:
# Could just ignore #undef's here: we'd then emit
# reliable #undef's (though maybe duplicates) at
# the end.
#print('UNDEFINED: %s' % repr(m.group(1)))
#logger.debug('UNDEFINED: %s' % repr(m.group(1)))
if defined.has_key(m.group(1)):
del defined[m.group(1)]
@ -130,10 +136,11 @@ def addAutomaticUndefs(f):
keys = sorted(defined.keys()) # deterministic order
if len(keys) > 0:
#print('STILL DEFINED: %r' % repr(defined.keys()))
#logger.debug('STILL DEFINED: %r' % repr(defined.keys()))
f.lines.append(Line(f.filename, len(f.lines) + 1, ''))
f.lines.append(Line(f.filename, len(f.lines) + 1, '/* automatic undefs */'))
for k in keys:
logger.debug('automatic #undef for ' + k)
f.lines.append(Line(f.filename, len(f.lines) + 1, '#undef %s' % k))
def createCombined(files, prologue_filename, line_directives):
@ -173,7 +180,7 @@ def createCombined(files, prologue_filename, line_directives):
# source or an include file. #include directives are handled
# recursively.
def processFile(f):
#print('Process file: ' + f.filename)
logger.debug('Process file: ' + f.filename)
for line in f.lines:
if not line.data.startswith('#include'):
@ -204,11 +211,11 @@ def createCombined(files, prologue_filename, line_directives):
incfile = lookupInclude(incpath)
if incfile is not None:
#print('Include considered internal: %s -> %s' % (repr(line.data), repr(incfile)))
logger.debug('Include considered internal: %s -> %s' % (repr(line.data), repr(incfile)))
emit('/* #include %s */' % incpath)
processFile(readFile(incfile))
else:
#print('Include considered external: %s' % repr(line.data))
logger.debug('Include considered external: %s' % repr(line.data))
emit(line) # keep as is
for f in files:
@ -226,6 +233,8 @@ def main():
parser.add_option('--output-source', dest='output_source', help='Output source filename')
parser.add_option('--output-metadata', dest='output_metadata', help='Output metadata filename')
parser.add_option('--line-directives', dest='line_directives', action='store_true', default=False, help='Use #line directives in combined source')
parser.add_option('--quiet', dest='quiet', action='store_true', default=False, help='Suppress info messages (show warnings)')
parser.add_option('--verbose', dest='verbose', action='store_true', default=False, help='Show verbose debug messages')
(opts, args) = parser.parse_args()
assert(opts.include_paths is not None)
@ -234,12 +243,18 @@ def main():
assert(opts.output_source)
assert(opts.output_metadata)
# Log level.
if opts.quiet:
logger.setLevel(logging.WARNING)
elif opts.verbose:
logger.setLevel(logging.DEBUG)
# Read input files, add automatic #undefs
sources = args
files = []
for fn in sources:
res = readFile(fn)
#print('Add automatic undefs for: ' + fn)
logger.debug('Add automatic undefs for: ' + fn)
addAutomaticUndefs(res)
files.append(res)
@ -250,7 +265,7 @@ def main():
with open(opts.output_metadata, 'wb') as f:
f.write(json.dumps(metadata, indent=4))
print('Combined %d source files, %d bytes written to %s' % (len(files), len(combined_source), opts.output_source))
logger.info('Combined %d source files, %d bytes written to %s' % (len(files), len(combined_source), opts.output_source))
if __name__ == '__main__':
main()
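
The hunks above show combine_src.py adopting the same logging arrangement used by the other tools touched in this PR (a module-level logger plus --quiet/--verbose switches); condensed into one self-contained sketch:

    import logging
    import sys
    import optparse

    # Shared pattern across combine_src.py, configure.py, genbuiltins.py,
    # and genconfig.py in this PR (condensed).
    logging.basicConfig(level=logging.INFO, stream=sys.stdout,
                        format='%(name)-21s %(levelname)-7s %(message)s')
    logger = logging.getLogger('combine_src.py')
    logger.setLevel(logging.INFO)

    parser = optparse.OptionParser()
    parser.add_option('--quiet', action='store_true', default=False,
                      help='Suppress info messages (show warnings)')
    parser.add_option('--verbose', action='store_true', default=False,
                      help='Show verbose debug messages')
    (opts, args) = parser.parse_args()

    if opts.quiet:
        logger.setLevel(logging.WARNING)
    elif opts.verbose:
        logger.setLevel(logging.DEBUG)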

tools/configure.py (103 changed lines)

@ -13,8 +13,13 @@
# scripts.
#
import os
import logging
import sys
logging.basicConfig(level=logging.INFO, stream=sys.stdout, format='%(name)-21s %(levelname)-7s %(message)s')
logger = logging.getLogger('configure.py')
logger.setLevel(logging.INFO)
import os
import re
import shutil
import glob
@ -41,13 +46,13 @@ def exec_get_stdout(cmd, input=None, default=None, print_stdout=False):
sys.stdout.write(ret[1]) # print stderr on error
sys.stdout.flush()
if default is not None:
print('WARNING: command %r failed, return default' % cmd)
logger.info('WARNING: command %r failed, return default' % cmd)
return default
raise Exception('command failed, return code %d: %r' % (proc.returncode, cmd))
return ret[0]
except:
if default is not None:
print('WARNING: command %r failed, return default' % cmd)
logger.info('WARNING: command %r failed, return default' % cmd)
return default
raise
@ -102,14 +107,14 @@ def read_file(src, strip_last_nl=False):
def delete_matching_files(dirpath, cb):
for fn in os.listdir(dirpath):
if os.path.isfile(os.path.join(dirpath, fn)) and cb(fn):
#print('Deleting %r' % os.path.join(dirpath, fn))
logger.debug('Deleting %r' % os.path.join(dirpath, fn))
os.unlink(os.path.join(dirpath, fn))
def create_targz(dstfile, filelist):
# https://docs.python.org/2/library/tarfile.html#examples
def _add(tf, fn): # recursive add
#print('Adding to tar: ' + fn)
logger.debug('Adding to tar: ' + fn)
if os.path.isdir(fn):
for i in sorted(os.listdir(fn)):
_add(tf, os.path.join(fn, i))
@ -220,7 +225,7 @@ def main():
# Options for configure.py tool itself.
parser.add_option('--source-directory', dest='source_directory', default=None, help='Directory with raw input sources (src-input/)')
parser.add_option('--output-directory', dest='output_directory', default=None, help='Directory for output files, must already exist')
parser.add_option('--output-directory', dest='output_directory', default=None, help='Directory for output files (created automatically, must not exist)')
parser.add_option('--git-commit', dest='git_commit', default=None, help='Force git commit hash')
parser.add_option('--git-describe', dest='git_describe', default=None, help='Force git describe')
parser.add_option('--git-branch', dest='git_branch', default=None, help='Force git branch name')
@ -233,7 +238,8 @@ def main():
# Options forwarded to genbuiltins.py.
parser.add_option('--rom-support', dest='rom_support', action='store_true', help='Add support for ROM strings/objects (increases duktape.c size considerably)')
parser.add_option('--rom-auto-lightfunc', dest='rom_auto_lightfunc', action='store_true', default=False, help='Convert ROM built-in function properties into lightfuncs automatically whenever possible')
parser.add_option('--user-builtin-metadata', dest='user_builtin_metadata', metavar='FILENAME', action='append', default=[], help='User strings and objects to add, YAML format (can be repeated for multiple overrides)')
parser.add_option('--user-builtin-metadata', dest='obsolete_builtin_metadata', default=None, help=optparse.SUPPRESS_HELP)
parser.add_option('--builtin-file', dest='builtin_files', metavar='FILENAME', action='append', default=[], help='Built-in string/object YAML metadata to be applied over default built-ins (multiple files may be given, applied in sequence)')
# Options for Unicode.
parser.add_option('--unicode-data', dest='unicode_data', default=None, help='Provide custom UnicodeData.txt')
@ -242,17 +248,35 @@ def main():
# Options forwarded to genconfig.py.
genconfig.add_genconfig_optparse_options(parser)
# Log level options.
parser.add_option('--quiet', dest='quiet', action='store_true', default=False, help='Suppress info messages (show warnings)')
parser.add_option('--verbose', dest='verbose', action='store_true', default=False, help='Show verbose debug messages')
(opts, args) = parser.parse_args()
assert(opts.source_directory)
srcdir = opts.source_directory
assert(opts.output_directory)
outdir = opts.output_directory
if os.path.exists(outdir):
raise Exception('configure target directory %s already exists, please delete first' % repr(outdir))
assert(opts.config_metadata)
if opts.obsolete_builtin_metadata is not None:
raise Exception('--user-builtin-metadata has been removed, use --builtin-file instead')
# Log level.
forward_loglevel = []
if opts.quiet:
logger.setLevel(logging.WARNING)
forward_loglevel = [ '--quiet' ]
elif opts.verbose:
logger.setLevel(logging.DEBUG)
forward_loglevel = [ '--verbose' ]
# Figure out directories, git info, etc
entry_pwd = os.getcwd()
entry_cwd = os.getcwd()
duk_dist_meta = None
if opts.duk_dist_meta is not None:
@ -279,13 +303,13 @@ def main():
git_branch = opts.git_branch
if git_commit is None:
print('Git commit not specified, autodetect from current directory')
logger.debug('Git commit not specified, autodetect from current directory')
git_commit = exec_get_stdout([ 'git', 'rev-parse', 'HEAD' ], default='external').strip()
if git_describe is None:
print('Git describe not specified, autodetect from current directory')
logger.debug('Git describe not specified, autodetect from current directory')
git_describe = exec_get_stdout([ 'git', 'describe', '--always', '--dirty' ], default='external').strip()
if git_branch is None:
print('Git branch not specified, autodetect from current directory')
logger.debug('Git branch not specified, autodetect from current directory')
git_branch = exec_get_stdout([ 'git', 'rev-parse', '--abbrev-ref', 'HEAD' ], default='external').strip()
git_commit = str(git_commit)
@ -305,14 +329,17 @@ def main():
else:
special_casing = opts.special_casing
print('Configuring Duktape version %s, commit %s, describe %s, branch %s' % \
(duk_version_formatted, git_commit, git_describe, git_branch))
logger.info('Configuring Duktape version %s, commit %s, describe %s, branch %s' % \
(duk_version_formatted, git_commit, git_describe, git_branch))
# Create output directory.
os.mkdir(outdir)
# Temporary directory.
tempdir = tempfile.mkdtemp(prefix='tmp-duk-prepare-')
atexit.register(shutil.rmtree, tempdir)
mkdir(os.path.join(tempdir, 'src'))
#print('Using temporary directory %r' % tempdir)
logger.debug('Using temporary directory %r' % tempdir)
# Separate sources are mostly copied as is at present.
copy_files([
@ -490,8 +517,8 @@ def main():
cmd += forward_genconfig_options()
cmd += [
'duk-config-header'
]
#print(repr(cmd))
] + forward_loglevel
logger.debug(repr(cmd))
exec_print_stdout(cmd)
copy_file(os.path.join(tempdir, 'duk_config.h.tmp'), os.path.join(outdir, 'duk_config.h'))
@ -530,7 +557,6 @@ def main():
# There are currently no profile specific variants of strings/builtins, but
# this will probably change when functions are added/removed based on profile.
# XXX: call as direct python
res = exec_get_stdout([
sys.executable,
os.path.join('tools', 'scan_used_stridx_bidx.py')
@ -541,7 +567,6 @@ def main():
with open(os.path.join(tempdir, 'duk_used_stridx_bidx_defs.json.tmp'), 'wb') as f:
f.write(res)
# XXX: call as direct python? does this need to work outside of configure.py?
cmd = [
sys.executable,
os.path.join('tools', 'genbuiltins.py'),
@ -564,16 +589,17 @@ def main():
if opts.rom_support:
# ROM string/object support is not enabled by default because
# it increases the generated duktape.c considerably.
print('Enabling --rom-support for genbuiltins.py')
logger.debug('Enabling --rom-support for genbuiltins.py')
cmd.append('--rom-support')
if opts.rom_auto_lightfunc:
print('Enabling --rom-auto-lightfunc for genbuiltins.py')
logger.debug('Enabling --rom-auto-lightfunc for genbuiltins.py')
cmd.append('--rom-auto-lightfunc')
for fn in opts.user_builtin_metadata:
print('Forwarding --user-builtin-metadata %s' % fn)
cmd.append('--user-builtin-metadata')
for fn in opts.builtin_files:
logger.debug('Forwarding --builtin-file %s' % fn)
cmd.append('--builtin-file')
cmd.append(fn)
#print(repr(cmd))
cmd += forward_loglevel
logger.debug(repr(cmd))
exec_print_stdout(cmd)
# Autogenerated Unicode files
@ -635,17 +661,17 @@ def main():
IDPART_MINUS_IDSTART_NOABMP_INCL=IDPART_MINUS_IDSTART_NOA_INCL
IDPART_MINUS_IDSTART_NOABMP_EXCL='Lu,Ll,Lt,Lm,Lo,Nl,0024,005F,ASCII,NONBMP'
print('Expand UnicodeData.txt ranges')
logger.debug('Expand UnicodeData.txt ranges')
exec_print_stdout([
sys.executable,
os.path.join('tools', 'prepare_unicode_data.py'),
'--unicode-data', unicode_data,
'--output', os.path.join(tempdir, 'UnicodeData-expanded.tmp')
])
] + forward_loglevel)
def extract_chars(incl, excl, suffix):
#print('- extract_chars: %s %s %s' % (incl, excl, suffix))
logger.debug('- extract_chars: %s %s %s' % (incl, excl, suffix))
res = exec_get_stdout([
sys.executable,
os.path.join('tools', 'extract_chars.py'),
@ -660,7 +686,7 @@ def main():
f.write(res)
def extract_caseconv():
#print('- extract_caseconv case conversion')
logger.debug('- extract_caseconv case conversion')
res = exec_get_stdout([
sys.executable,
os.path.join('tools', 'extract_caseconv.py'),
@ -675,7 +701,7 @@ def main():
with open(os.path.join(tempdir, 'caseconv.txt'), 'wb') as f:
f.write(res)
#print('- extract_caseconv canon lookup')
logger.debug('- extract_caseconv canon lookup')
res = exec_get_stdout([
sys.executable,
os.path.join('tools', 'extract_caseconv.py'),
@ -689,7 +715,13 @@ def main():
with open(os.path.join(tempdir, 'caseconv_re_canon_lookup.txt'), 'wb') as f:
f.write(res)
print('Create Unicode tables for codepoint classes')
# XXX: Now with configure.py part of the distributable, could generate
# only those Unicode tables needed by desired configuration (e.g. BMP-only
# tables if BMP-only was enabled).
# XXX: Improve Unicode preparation performance; it consumes most of the
# source preparation time.
logger.debug('Create Unicode tables for codepoint classes')
extract_chars(WHITESPACE_INCL, WHITESPACE_EXCL, 'ws')
extract_chars(LETTER_INCL, LETTER_EXCL, 'let')
extract_chars(LETTER_NOA_INCL, LETTER_NOA_EXCL, 'let_noa')
@ -704,10 +736,10 @@ def main():
extract_chars(IDPART_MINUS_IDSTART_NOA_INCL, IDPART_MINUS_IDSTART_NOA_EXCL, 'idp_m_ids_noa')
extract_chars(IDPART_MINUS_IDSTART_NOABMP_INCL, IDPART_MINUS_IDSTART_NOABMP_EXCL, 'idp_m_ids_noabmp')
print('Create Unicode tables for case conversion')
logger.debug('Create Unicode tables for case conversion')
extract_caseconv()
print('Combine sources and clean up')
logger.debug('Combine sources and clean up')
# Inject autogenerated files into source and header files so that they are
# usable (for all profiles and define cases) directly.
@ -815,8 +847,8 @@ def main():
files.append(fn)
res = map(lambda x: os.path.join(tempdir, 'src', x), files)
#print(repr(files))
#print(repr(res))
logger.debug(repr(files))
logger.debug(repr(res))
return res
if opts.separate_sources:
@ -839,6 +871,7 @@ def main():
if opts.line_directives:
cmd += [ '--line-directives' ]
cmd += select_combined_sources()
cmd += forward_loglevel
exec_print_stdout(cmd)
# Merge metadata files.
@ -867,7 +900,7 @@ def main():
with open(os.path.join(outdir, 'duk_source_meta.json'), 'wb') as f:
f.write(json.dumps(doc, indent=4))
print('Configure finished successfully')
logger.debug('Configure finished successfully')
if __name__ == '__main__':
main()
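
Per the hunks above, ``--output-directory`` changed semantics: configure.py now creates the directory itself (via os.mkdir) and raises if it already exists. A hedged usage sketch:

    import os
    import shutil
    import subprocess

    outdir = 'src-custom'
    # configure.py now refuses to reuse an existing output directory,
    # so clear any previous run's output first.
    if os.path.exists(outdir):
        shutil.rmtree(outdir)
    subprocess.check_call([
        'python', 'tools/configure.py',
        '--source-directory', 'src-input',
        '--output-directory', outdir,  # created by configure.py itself
        '--config-metadata', 'config'
    ])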

tools/genbuiltins.py (189 changed lines)

@ -18,8 +18,13 @@
# all supported alternatives.
#
import os
import logging
import sys
logging.basicConfig(level=logging.INFO, stream=sys.stdout, format='%(name)-21s %(levelname)-7s %(message)s')
logger = logging.getLogger('genbuiltins.py')
logger.setLevel(logging.INFO)
import os
import re
import traceback
import json
@ -28,6 +33,7 @@ import math
import struct
import optparse
import copy
import logging
import dukutil
@ -158,14 +164,14 @@ def metadata_remove_disabled(meta):
objlist = []
for o in meta['objects']:
if o.get('disable', False):
print('Remove disabled object: %s' % o['id'])
logger.debug('Remove disabled object: %s' % o['id'])
else:
objlist.append(o)
props = []
for p in o['properties']:
if p.get('disable', False):
print('Remove disabled property: %s, object: %s' % (p['key'], o['id']))
logger.debug('Remove disabled property: %s, object: %s' % (p['key'], o['id']))
else:
props.append(p)
@ -192,8 +198,8 @@ def metadata_delete_dangling_references_to_object(meta, obj_id):
# XXX: Should empty accessor (= no getter, no setter) be deleted?
# If so, beware of shorthand.
if delprop:
print('Deleted property %s of object %s, points to deleted object %s' % \
(p['key'], o['id'], obj_id))
logger.debug('Deleted property %s of object %s, points to deleted object %s' % \
(p['key'], o['id'], obj_id))
else:
new_p.append(p)
o['properties'] = new_p
@ -209,12 +215,12 @@ def metadata_merge_user_objects(meta, user_meta):
for o in user_meta.get('objects', []):
if o.get('disable', False):
print('Skip disabled object: %s' % o['id'])
logger.debug('Skip disabled object: %s' % o['id'])
continue
targ, targ_idx = metadata_lookup_object_and_index(meta, o['id'])
if o.get('delete', False):
print('Delete object: %s' % targ['id'])
logger.debug('Delete object: %s' % targ['id'])
if targ is None:
raise Exception('Cannot delete object %s which doesn\'t exist' % o['id'])
meta['objects'].pop(targ_idx)
@ -222,16 +228,16 @@ def metadata_merge_user_objects(meta, user_meta):
continue
if o.get('replace', False):
print('Replace object %s' % o['id'])
logger.debug('Replace object %s' % o['id'])
if targ is None:
print('WARNING: object to be replaced doesn\'t exist, append new object')
logger.warning('object to be replaced doesn\'t exist, append new object')
meta['objects'].append(o)
else:
meta['objects'][targ_idx] = o
continue
if o.get('add', False) or not o.get('modify', False): # 'add' is the default
print('Add object %s' % o['id'])
logger.debug('Add object %s' % o['id'])
if targ is not None:
raise Exception('Cannot add object %s which already exists' % o['id'])
meta['objects'].append(o)
@ -248,23 +254,23 @@ def metadata_merge_user_objects(meta, user_meta):
targ[k] = o[k]
for p in o.get('properties', []):
if p.get('disable', False):
print('Skip disabled property: %s' % p['key'])
logger.debug('Skip disabled property: %s' % p['key'])
continue
prop = None
prop_idx = None
prop, prop_idx = metadata_lookup_property_and_index(targ, p['key'])
if prop is not None:
if p.get('delete', False):
print('Delete property %s of %s' % (p['key'], o['id']))
logger.debug('Delete property %s of %s' % (p['key'], o['id']))
targ['properties'].pop(prop_idx)
else:
print('Replace property %s of %s' % (p['key'], o['id']))
logger.debug('Replace property %s of %s' % (p['key'], o['id']))
targ['properties'][prop_idx] = p
else:
if p.get('delete', False):
print('Deleting property %s of %s: doesn\'t exist, nop' % (p['key'], o['id']))
logger.debug('Deleting property %s of %s: doesn\'t exist, nop' % (p['key'], o['id']))
else:
print('Add property %s of %s' % (p['key'], o['id']))
logger.debug('Add property %s of %s' % (p['key'], o['id']))
targ['properties'].append(p)
# Normalize nargs for top level functions by defaulting 'nargs' from 'length'.
@ -278,7 +284,7 @@ def metadata_normalize_nargs_length(meta):
for p in o['properties']:
if p['key'] != 'length':
continue
#print('Default nargs for top level: %r' % p)
logger.debug('Default nargs for top level: %r' % p)
assert(isinstance(p['value'], int))
o['nargs'] = p['value']
break
@ -291,10 +297,10 @@ def metadata_normalize_nargs_length(meta):
continue
pval = p['value']
if not pval.has_key('length'):
print('Default length for function shorthand: %r' % p)
logger.debug('Default length for function shorthand: %r' % p)
pval['length'] = 0
if not pval.has_key('nargs'):
#print('Default nargs for function shorthand: %r' % p)
logger.debug('Default nargs for function shorthand: %r' % p)
pval['nargs'] = pval['length']
# Prepare a list of built-in objects which need a runtime 'bidx'.
@ -399,7 +405,7 @@ def metadata_normalize_shorthand(meta):
val['setter'])
def decodeStructuredValue(val):
#print('Decode structured value: %r' % val)
logger.debug('Decode structured value: %r' % val)
if isinstance(val, (int, long, float, str)):
return val # as is
elif isinstance(val, (dict)):
@ -417,7 +423,7 @@ def metadata_normalize_shorthand(meta):
# deterministic. User can always use longhand for exact
# property control.
#print('Decode structured object: %r' % val)
logger.debug('Decode structured object: %r' % val)
obj = getSubObject()
obj['class'] = 'Object'
obj['internal_prototype'] = 'bi_object_prototype'
@ -425,7 +431,7 @@ def metadata_normalize_shorthand(meta):
props = obj['properties']
keys = sorted(val.keys())
for k in keys:
#print('Decode property %s' % k)
logger.debug('Decode property %s' % k)
prop = { 'key': k, 'value': decodeStructuredValue(val[k]), 'attributes': 'wec' }
props.append(prop)
@ -451,7 +457,7 @@ def metadata_normalize_shorthand(meta):
# Date.prototype.toGMTString must point to the same Function object
# as Date.prototype.toUTCString, so special case hack it here.
if obj['id'] == 'bi_date_prototype' and val['key'] == 'toGMTString':
#print('Skip Date.prototype.toGMTString')
logger.debug('Skip Date.prototype.toGMTString')
continue
if isinstance(val['value'], dict) and val['value']['type'] == 'function':
@ -469,7 +475,7 @@ def metadata_normalize_shorthand(meta):
prop = clonePropShared(val)
prop['value'] = { 'type': 'accessor', 'getter_id': sub_getter['id'], 'setter_id': sub_setter['id'] }
assert('a' in prop['attributes']) # If missing, weird things happen runtime
#print('Expand accessor shorthand: %r -> %r' % (val, prop))
logger.debug('Expand accessor shorthand: %r -> %r' % (val, prop))
repl_props.append(prop)
elif isinstance(val['value'], dict) and val['value']['type'] == 'structured':
# Structured shorthand.
@ -477,7 +483,7 @@ def metadata_normalize_shorthand(meta):
prop = clonePropShared(val)
prop['value'] = subval
repl_props.append(prop)
print('Decoded structured shorthand for object %s, property %s' % (obj['id'], val['key']))
logger.debug('Decoded structured shorthand for object %s, property %s' % (obj['id'], val['key']))
elif isinstance(val['value'], dict) and val['value']['type'] == 'buffer':
# Duktape buffer type not yet supported.
raise Exception('Buffer type not yet supported for builtins: %r' % val)
@ -489,7 +495,7 @@ def metadata_normalize_shorthand(meta):
repl_props.append(val)
if obj['id'] == 'bi_date_prototype' and val['key'] == 'toUTCString':
#print('Clone Date.prototype.toUTCString to Date.prototype.toGMTString')
logger.debug('Clone Date.prototype.toUTCString to Date.prototype.toGMTString')
prop2 = copy.deepcopy(repl_props[-1])
prop2['key'] = 'toGMTString'
repl_props.append(prop2)
@ -502,7 +508,7 @@ def metadata_normalize_shorthand(meta):
meta['objects'] += subobjs
len_after = len(meta['objects'])
print('Normalized metadata shorthand, %d objects -> %d final objects' % (len_before, len_after))
logger.debug('Normalized metadata shorthand, %d objects -> %d final objects' % (len_before, len_after))
# Normalize property attribute order, default attributes, etc.
def metadata_normalize_property_attributes(meta):
@ -518,7 +524,7 @@ def metadata_normalize_property_attributes(meta):
attrs = 'ca' # accessor default is configurable
else:
attrs = 'wc' # default is writable, configurable
#print('Defaulted attributes of %s/%s to %s' % (o['id'], p['key'], attrs))
logger.debug('Defaulted attributes of %s/%s to %s' % (o['id'], p['key'], attrs))
# Decode flags to normalize their order in the end.
writable = 'w' in attrs
@ -528,7 +534,7 @@ def metadata_normalize_property_attributes(meta):
# Force 'accessor' attribute for accessors.
if is_accessor and not accessor:
#print('Property %s is accessor but has no "a" attribute, add attribute' % p['key'])
logger.debug('Property %s is accessor but has no "a" attribute, add attribute' % p['key'])
accessor = True
# Normalize order and write back.
@ -544,7 +550,7 @@ def metadata_normalize_property_attributes(meta):
p['attributes'] = attrs
if orig_attrs != attrs:
#print('Updated attributes of %s/%s from %r to %r' % (o['id'], p['key'], orig_attrs, attrs))
logger.debug('Updated attributes of %s/%s from %r to %r' % (o['id'], p['key'], orig_attrs, attrs))
pass
# Normalize ROM property attributes.
@ -570,11 +576,11 @@ def metadata_normalize_ram_function_names(meta):
break
if name_prop is None:
num_added += 1
#print('Adding missing "name" property for function %s' % o['id'])
logger.debug('Adding missing "name" property for function %s' % o['id'])
o['properties'].append({ 'key': 'name', 'value': '', 'attributes': '' })
if num_added > 0:
print('Added missing "name" property for %d functions' % num_added)
logger.debug('Added missing "name" property for %d functions' % num_added)
# Add a built-in objects list for RAM initialization.
def metadata_add_ram_filtered_object_list(meta):
@ -598,7 +604,7 @@ def metadata_add_ram_filtered_object_list(meta):
if keep:
objlist.append(o)
print('Filtered RAM object list: %d objects with bidx, %d total top level objects' % \
logger.debug('Filtered RAM object list: %d objects with bidx, %d total top level objects' % \
(len(meta['objects_bidx']), len(objlist)))
meta['objects_ram_toplevel'] = objlist
@ -619,20 +625,20 @@ def metadata_normalize_missing_strings(meta, user_meta):
for prop in obj['properties']:
key = prop['key']
if not strs_have.get(key):
#print('Add missing string: %r' % key)
logger.debug('Add missing string: %r' % key)
meta['strings'].append({ 'str': key, '_auto_add_ref': True })
strs_have[key] = True
if prop.has_key('value') and isinstance(prop['value'], (str, unicode)):
val = unicode_to_bytes(prop['value']) # XXX: should already be
val = unicode_to_bytes(prop['value']) # should already be, just in case
if not strs_have.get(val):
#print('Add missing string: %r' % val)
logger.debug('Add missing string: %r' % val)
meta['strings'].append({ 'str': val, '_auto_add_ref': True })
strs_have[val] = True
# Force user strings to be in ROM data.
for s in user_meta.get('add_forced_strings', []):
if not strs_have.get(s['str']):
#print('Add user string: %r' % s['str'])
logger.debug('Add user string: %r' % s['str'])
s['_auto_add_user'] = True
meta['strings'].append(s)
@ -661,14 +667,14 @@ def metadata_convert_lightfuncs(meta):
for p2 in targ['properties']:
# Don't convert if function has more properties than
# we're willing to sacrifice.
#print(' - Check %r . %s' % (o.get('id', None), p2['key']))
logger.debug(' - Check %r . %s' % (o.get('id', None), p2['key']))
if p2['key'] == 'length' and isinstance(p2['value'], (int, long)):
lf_len = p2['value']
if p2['key'] not in [ 'length', 'name' ]:
reasons.append('nonallowed-property')
if not p.get('autoLightfunc', True):
print('Automatic lightfunc conversion rejected for key %s, explicitly requested in metadata' % p['key'])
logger.debug('Automatic lightfunc conversion rejected for key %s, explicitly requested in metadata' % p['key'])
reasons.append('no-auto-lightfunc')
# lf_len comes from actual property table (after normalization)
@ -679,9 +685,9 @@ def metadata_convert_lightfuncs(meta):
# yet ready. If so, reject the lightfunc conversion
# for now. In practice this doesn't matter.
lf_magic = resolve_magic(targ.get('magic'), {}) # empty map is a "fake" bidx map
#print('resolved magic ok -> %r' % lf_magic)
logger.debug('resolved magic ok -> %r' % lf_magic)
except Exception, e:
#print('Failed to resolve magic for %r: %r' % (p['key'], e))
logger.debug('Failed to resolve magic for %r: %r' % (p['key'], e))
reasons.append('magic-resolve-failed')
lf_magic = 0xffffffff # dummy, will be out of bounds
else:
@ -694,17 +700,17 @@ def metadata_convert_lightfuncs(meta):
lf_varargs = False
if lf_len < 0 or lf_len > 15:
#print('lf_len out of bounds: %r' % lf_len)
logger.debug('lf_len out of bounds: %r' % lf_len)
reasons.append('len-bounds')
if lf_magic < -0x80 or lf_magic > 0x7f:
#print('lf_magic out of bounds: %r' % lf_magic)
logger.debug('lf_magic out of bounds: %r' % lf_magic)
reasons.append('magic-bounds')
if not lf_varargs and (lf_nargs < 0 or lf_nargs > 14):
#print('lf_nargs out of bounds: %r' % lf_nargs)
logger.debug('lf_nargs out of bounds: %r' % lf_nargs)
reasons.append('nargs-bounds')
if len(reasons) > 0:
#print('Don\'t convert to lightfunc: %r %r (%r): %r' % (o.get('id', None), p.get('key', None), p['value']['id'], reasons))
logger.debug('Don\'t convert to lightfunc: %r %r (%r): %r' % (o.get('id', None), p.get('key', None), p['value']['id'], reasons))
num_skipped += 1
continue
@ -717,11 +723,11 @@ def metadata_convert_lightfuncs(meta):
'nargs': lf_nargs,
'varargs': lf_varargs
}
#print(' - Convert to lightfunc: %r %r (%r) -> %r' % (o.get('id', None), p.get('key', None), p_id, p['value']))
logger.debug(' - Convert to lightfunc: %r %r (%r) -> %r' % (o.get('id', None), p.get('key', None), p_id, p['value']))
num_converted += 1
print('Converted %d built-in function properties to lightfuncs, %d skipped as non-eligible' % (num_converted, num_skipped))
logger.debug('Converted %d built-in function properties to lightfuncs, %d skipped as non-eligible' % (num_converted, num_skipped))
# Detect objects not reachable from any object with a 'bidx'. This is usually
# a user error because such objects can't be reached at runtime so they're
@ -757,8 +763,8 @@ def metadata_remove_orphan_objects(meta):
_markId(v.get('getter_id'))
_markId(v.get('setter_id'))
#print('Mark reachable: reachable count initially %d, now %d' % \
# (reachable_count, len(reachable.keys())))
logger.debug('Mark reachable: reachable count initially %d, now %d' % \
(reachable_count, len(reachable.keys())))
if reachable_count == len(reachable.keys()):
break
@ -768,14 +774,14 @@ def metadata_remove_orphan_objects(meta):
deleted = False
for i,o in enumerate(meta['objects']):
if not reachable.has_key(o['id']):
#print('WARNING: object %s not reachable, dropping' % o['id'])
logger.debug('object %s not reachable, dropping' % o['id'])
meta['objects'].pop(i)
deleted = True
num_deleted += 1
break
if num_deleted > 0:
print('Deleted %d unreachable objects' % num_deleted)
logger.debug('Deleted %d unreachable objects' % num_deleted)
# Add C define names for builtin strings. These defines are added to all
# strings, even when they won't get a stridx because the define names are
@ -930,7 +936,7 @@ def dump_metadata(meta, fn):
tmp = json.dumps(recursive_bytes_to_strings(meta), indent=4)
with open(fn, 'wb') as f:
f.write(tmp)
print('Wrote metadata dump to %s' % fn)
logger.debug('Wrote metadata dump to %s' % fn)
# Main metadata loading function: load metadata from multiple sources,
# merge and normalize, prepare various indexes etc.
@ -950,8 +956,8 @@ def load_metadata(opts, rom=False, build_info=None):
# Add user objects.
user_meta = {}
for fn in opts.user_builtin_metadata:
print('Merging user builtin metadata file %s' % fn)
for fn in opts.builtin_files:
logger.debug('Merging user builtin metadata file %s' % fn)
with open(fn, 'rb') as f:
user_meta = recursive_strings_to_bytes(yaml.load(f))
metadata_merge_user_objects(meta, user_meta)
@ -1021,7 +1027,7 @@ def load_metadata(opts, rom=False, build_info=None):
# into the string list (not the 'stridx' list though): all strings
# referenced by ROM objects must also be in ROM.
if rom:
for fn in opts.user_builtin_metadata:
for fn in opts.builtin_files:
# XXX: awkward second pass
with open(fn, 'rb') as f:
user_meta = recursive_strings_to_bytes(yaml.load(f))
@ -1132,12 +1138,12 @@ def load_metadata(opts, rom=False, build_info=None):
count_add_user += 1
count_add = count_add_ref + count_add_user
print(('Prepared %s metadata: %d objects, %d objects with bidx, ' + \
'%d strings, %d strings with stridx, %d strings added ' + \
'(%d property key references, %d user strings)') % \
(meta_name, len(meta['objects']), len(meta['objects_bidx']), \
len(meta['strings']), len(meta['strings_stridx']), \
count_add, count_add_ref, count_add_user))
logger.debug(('Prepared %s metadata: %d objects, %d objects with bidx, ' + \
'%d strings, %d strings with stridx, %d strings added ' + \
'(%d property key references, %d user strings)') % \
(meta_name, len(meta['objects']), len(meta['objects_bidx']), \
len(meta['strings']), len(meta['strings_stridx']), \
count_add, count_add_ref, count_add_user))
return meta
@ -1459,16 +1465,16 @@ def gen_ramstr_initdata_bitpacked(meta):
be.bits(SEVENBIT, 5)
be.bits(ord(c), 7)
n_sevenbit += 1
#print('sevenbit for: %r' % c)
#logger.debug('sevenbit for: %r' % c)
# end marker not necessary, C code knows length from define
res = be.getByteString()
print('%d ram strings, %d bytes of string init data, %d maximum string length, ' + \
'encoding: optimal=%d,switch1=%d,switch=%d,sevenbit=%d') % \
(len(meta['strings_stridx']), len(res), maxlen, \
n_optimal, n_switch1, n_switch, n_sevenbit)
logger.debug(('%d ram strings, %d bytes of string init data, %d maximum string length, ' + \
'encoding: optimal=%d,switch1=%d,switch=%d,sevenbit=%d') % \
(len(meta['strings_stridx']), len(res), maxlen, \
n_optimal, n_switch1, n_switch, n_sevenbit))
return res, maxlen
@ -1706,7 +1712,7 @@ def gen_ramobj_initdata_for_props(meta, be, bi, string_to_stridx, natfunc_name_t
if bi['id'] == 'bi_date_prototype':
prop_togmtstring = steal_prop(props, 'toGMTString')
assert(prop_togmtstring is not None)
#print('Stole Date.prototype.toGMTString')
logger.debug('Stole Date.prototype.toGMTString')
# Split properties into non-builtin functions and other properties.
# This split is a bit arbitrary, but is used to reduce flag bits in
@ -1741,14 +1747,14 @@ def gen_ramobj_initdata_for_props(meta, be, bi, string_to_stridx, natfunc_name_t
attrs = valspec.get('attributes', default_attrs)
attrs = attrs.replace('a', '') # ram bitstream doesn't encode 'accessor' attribute
if attrs != default_attrs:
#print('non-default attributes: %s -> %r (default %r)' % (valspec['key'], attrs, default_attrs))
logger.debug('non-default attributes: %s -> %r (default %r)' % (valspec['key'], attrs, default_attrs))
be.bits(1, 1) # flag: have custom attributes
be.bits(encode_property_flags(attrs), PROP_FLAGS_BITS)
else:
be.bits(0, 1) # flag: no custom attributes
if val is None:
print('WARNING: RAM init data format doesn\'t support "null" now, value replaced with "undefined": %r' % valspec)
logger.warning('RAM init data format doesn\'t support "null" now, value replaced with "undefined": %r' % valspec)
#raise Exception('RAM init format doesn\'t support a "null" value now')
be.bits(PROP_TYPE_UNDEFINED, PROP_TYPE_BITS)
elif isinstance(val, bool):
@ -1776,7 +1782,7 @@ def gen_ramobj_initdata_for_props(meta, be, bi, string_to_stridx, natfunc_name_t
data = ''.join([ val[indexlist[idx]] for idx in xrange(8) ])
#print('DOUBLE: %s -> %s' % (val.encode('hex'), data.encode('hex')))
logger.debug('DOUBLE: %s -> %s' % (val.encode('hex'), data.encode('hex')))
if len(data) != 8:
raise Exception('internal error')
@ -1819,7 +1825,7 @@ def gen_ramobj_initdata_for_props(meta, be, bi, string_to_stridx, natfunc_name_t
assert(getter_fn['magic'] == 0)
assert(setter_fn['magic'] == 0)
elif val['type'] == 'lightfunc':
print('WARNING: RAM init data format doesn\'t support "lightfunc" now, value replaced with "undefined": %r' % valspec)
logger.warning('RAM init data format doesn\'t support "lightfunc" now, value replaced with "undefined": %r' % valspec)
be.bits(PROP_TYPE_UNDEFINED, PROP_TYPE_BITS)
else:
raise Exception('unsupported value: %s' % repr(val))
@ -1920,10 +1926,10 @@ def gen_ramobj_initdata_bitpacked(meta, native_funcs, natfunc_name_to_natidx, do
count_function_props += count_obj_func
romobj_init_data = be.getByteString()
#print(repr(romobj_init_data))
#print(len(romobj_init_data))
#logger.debug(repr(romobj_init_data))
#logger.debug(len(romobj_init_data))
print('%d ram builtins, %d normal properties, %d function properties, %d bytes of object init data' % \
logger.debug('%d ram builtins, %d normal properties, %d function properties, %d bytes of object init data' % \
(count_builtins, count_normal_props, count_function_props, len(romobj_init_data)))
return romobj_init_data
@ -2177,7 +2183,6 @@ def rom_emit_strings_source(genc, meta):
if blen == clen:
flags.append('DUK_HSTRING_FLAG_ASCII')
if is_arridx:
#print('%r is arridx' % v)
flags.append('DUK_HSTRING_FLAG_ARRIDX')
if len(v) >= 1 and v[0] == '\xff':
flags.append('DUK_HSTRING_FLAG_INTERNAL')
@ -2736,8 +2741,8 @@ def rom_emit_objects(genc, meta, bi_str_map):
genc.emitLine('};')
genc.emitLine('#endif')
print('%d compressed rom pointers (used range is [0x%04x,0x%04x], %d space left)' % \
(len(romptr_compress_list), ROMPTR_FIRST, romptr_highest, 0xffff - romptr_highest))
logger.debug('%d compressed rom pointers (used range is [0x%04x,0x%04x], %d space left)' % \
(len(romptr_compress_list), ROMPTR_FIRST, romptr_highest, 0xffff - romptr_highest))
# Undefine helpers.
genc.emitLine('')
@ -2801,7 +2806,7 @@ def emit_header_native_function_declarations(genc, meta):
if isinstance(v, dict) and v['type'] == 'lightfunc':
assert(v.has_key('native'))
_emit(v['native'])
#print('Lightfunc function declaration: %r' % v['native'])
logger.debug('Lightfunc function declaration: %r' % v['native'])
for fname in funclist:
# Visibility depends on whether the function is Duktape internal or user.
@ -2821,10 +2826,13 @@ def main():
parser.add_option('--git-describe', dest='git_describe', default=None, help='Git describe')
parser.add_option('--git-branch', dest='git_branch', default=None, help='Git branch name')
parser.add_option('--duk-version', dest='duk_version', default=None, help='Duktape version (e.g. 10203)')
parser.add_option('--quiet', dest='quiet', action='store_true', default=False, help='Suppress info messages (show warnings)')
parser.add_option('--verbose', dest='verbose', action='store_true', default=False, help='Show verbose debug messages')
parser.add_option('--used-stridx-metadata', dest='used_stridx_metadata', help='DUK_STRIDX_xxx used by source/headers, JSON format')
parser.add_option('--strings-metadata', dest='strings_metadata', help='Built-in strings metadata file, YAML format')
parser.add_option('--objects-metadata', dest='objects_metadata', help='Built-in objects metadata file, YAML format')
parser.add_option('--user-builtin-metadata', dest='user_builtin_metadata', action='append', default=[], help='User strings and objects to add, YAML format (can be repeated for multiple overrides)')
parser.add_option('--strings-metadata', dest='strings_metadata', help='Default built-in strings metadata file, YAML format')
parser.add_option('--objects-metadata', dest='objects_metadata', help='Default built-in objects metadata file, YAML format')
parser.add_option('--user-builtin-metadata', dest='obsolete_builtin_metadata', default=None, help=optparse.SUPPRESS_HELP)
parser.add_option('--builtin-file', dest='builtin_files', metavar='FILENAME', action='append', default=[], help='Built-in string/object YAML metadata to be applied over default built-ins (multiple files may be given, applied in sequence)')
parser.add_option('--ram-support', dest='ram_support', action='store_true', default=False, help='Support RAM strings/objects')
parser.add_option('--rom-support', dest='rom_support', action='store_true', default=False, help='Support ROM strings/objects (increases output size considerably)')
parser.add_option('--rom-auto-lightfunc', dest='rom_auto_lightfunc', action='store_true', default=False, help='Convert ROM built-in function properties into lightfuncs automatically whenever possible')
@ -2835,6 +2843,15 @@ def main():
parser.add_option('--dev-dump-final-rom-metadata', dest='dev_dump_final_rom_metadata', help='Development option')
(opts, args) = parser.parse_args()
if opts.obsolete_builtin_metadata is not None:
raise Exception('--user-builtin-metadata has been removed, use --builtin-file instead')
# Log level.
if opts.quiet:
logger.setLevel(logging.WARNING)
elif opts.verbose:
logger.setLevel(logging.DEBUG)
# Options processing.
build_info = {
@ -2844,6 +2861,15 @@ def main():
'duk_version': int(opts.duk_version),
}
desc = []
if opts.ram_support:
desc += [ 'ram built-in support' ]
if opts.rom_support:
desc += [ 'rom built-in support' ]
if opts.rom_auto_lightfunc:
desc += [ 'rom auto lightfunc' ]
logger.info('Creating built-in initialization data: ' + ', '.join(desc))
# Read in metadata files, normalizing and merging as necessary.
ram_meta = load_metadata(opts, rom=False, build_info=build_info)
@ -2958,9 +2984,11 @@ def main():
with open(opts.out_source, 'wb') as f:
f.write(gc_src.getString())
logger.debug('Wrote built-ins source to ' + opts.out_source)
with open(opts.out_header, 'wb') as f:
f.write(gc_hdr.getString())
logger.debug('Wrote built-ins header to ' + opts.out_header)
# Write a JSON file with build metadata, e.g. built-in strings.
@ -2990,6 +3018,7 @@ def main():
with open(opts.out_metadata_json, 'wb') as f:
f.write(json.dumps(meta, indent=4, sort_keys=True, ensure_ascii=True))
logger.debug('Wrote built-ins metadata to ' + opts.out_metadata_json)
if __name__ == '__main__':
main()
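
The option rename above uses a small optparse pattern worth noting: the removed flag is kept as a hidden option so stale build scripts fail with a clear message instead of having the flag silently ignored. A condensed sketch:

    import optparse

    parser = optparse.OptionParser()
    # Replacement option: may be repeated, files applied in sequence.
    parser.add_option('--builtin-file', dest='builtin_files', metavar='FILENAME',
                      action='append', default=[])
    # Removed option: hidden from --help, but still parsed so it fails loudly.
    parser.add_option('--user-builtin-metadata', dest='obsolete_builtin_metadata',
                      default=None, help=optparse.SUPPRESS_HELP)
    (opts, args) = parser.parse_args()

    if opts.obsolete_builtin_metadata is not None:
        raise Exception('--user-builtin-metadata has been removed, use --builtin-file instead')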

tools/genconfig.py (101 changed lines)

@ -21,8 +21,13 @@
# modular sources.
#
import os
import logging
import sys
logging.basicConfig(level=logging.INFO, stream=sys.stdout, format='%(name)-21s %(levelname)-7s %(message)s')
logger = logging.getLogger('genconfig.py')
logger.setLevel(logging.INFO)
import os
import re
import json
import yaml
@ -31,6 +36,7 @@ import tarfile
import tempfile
import atexit
import shutil
import logging
try:
from StringIO import StringIO
except ImportError:
@ -72,6 +78,7 @@ allowed_use_meta_keys = [
'default',
'tags',
'description',
'warn_if_missing'
]
required_opt_meta_keys = [
'define',
@ -149,7 +156,7 @@ compiler_required_provides = [
def get_auto_delete_tempdir():
tmpdir = tempfile.mkdtemp(suffix='-genconfig')
def _f(dirname):
#print('Deleting temporary directory: %r' % dirname)
logger.debug('Deleting temporary directory: %r' % dirname)
if os.path.isdir(dirname) and '-genconfig' in dirname:
shutil.rmtree(dirname)
atexit.register(_f, tmpdir)
@ -204,7 +211,8 @@ class Snippet:
self.requires[k] = True
stripped_lines = strip_comments_from_lines(lines)
# for line in stripped_lines: print(line)
#for line in stripped_lines:
# logger.debug(line)
for line in stripped_lines:
# Careful with order, snippet may self-reference its own
@ -222,7 +230,7 @@ class Snippet:
if m is not None and '/* redefine */' not in line and \
len(m.group(1)) > 0 and m.group(1)[-1] != '_':
# Don't allow e.g. DUK_USE_ which results from matching DUK_USE_xxx
#print('PROVIDES: %r' % m.group(1))
#logger.debug('PROVIDES: %r' % m.group(1))
self.provides[m.group(1)] = True
if autoscan_requires:
matches = re.findall(re_line_requires, line)
@ -240,7 +248,7 @@ class Snippet:
# Snippet provides it's own require; omit
pass
else:
#print('REQUIRES: %r' % m)
#logger.debug('REQUIRES: %r' % m)
self.requires[m] = True
def fromFile(cls, filename):
@ -253,7 +261,7 @@ class Snippet:
m = re.match(r'#snippet\s+"(.*?)"', line)
# XXX: better plumbing for lookup path
sub_fn = os.path.normpath(os.path.join(filename, '..', '..', 'header-snippets', m.group(1)))
#print('#snippet ' + sub_fn)
logger.debug('#snippet ' + sub_fn)
sn = Snippet.fromFile(sub_fn)
lines += sn.lines
else:
@ -392,7 +400,7 @@ def fill_dependencies_for_snippets(snippets, idx_deps):
found = True # at least one other node provides 'k'
if not found:
#print('Resolving %r' % k)
logger.debug('Resolving %r' % k)
resolved.append(k)
# Find a header snippet which provides the missing define.
@ -405,7 +413,7 @@ def fill_dependencies_for_snippets(snippets, idx_deps):
sn_req = sn2
break
if sn_req is None:
print(repr(sn.lines))
logger.debug(repr(sn.lines))
raise Exception('cannot resolve missing require: %r' % k)
# Snippet may have further unresolved provides; add recursively
@ -452,15 +460,15 @@ def fill_dependencies_for_snippets(snippets, idx_deps):
for sn in snlist:
if handled.has_key(sn):
continue
print('UNHANDLED KEY')
print('PROVIDES: %r' % sn.provides)
print('REQUIRES: %r' % sn.requires)
print('\n'.join(sn.lines))
logger.debug('UNHANDLED KEY')
logger.debug('PROVIDES: %r' % sn.provides)
logger.debug('REQUIRES: %r' % sn.requires)
logger.debug('\n'.join(sn.lines))
# print(repr(graph))
# print(repr(snlist))
# print('Resolved helper defines: %r' % resolved)
# print('Resolved %d helper defines' % len(resolved))
#logger.debug(repr(graph))
#logger.debug(repr(snlist))
logger.debug('Resolved helper defines: %r' % resolved)
logger.debug('Resolved %d helper defines' % len(resolved))
def serialize_snippet_list(snippets):
ret = []
@ -476,7 +484,7 @@ def serialize_snippet_list(snippets):
for k in sn.requires.keys():
if not emitted_provides.has_key(k):
# XXX: conditional warning, happens in some normal cases
#print('WARNING: define %r required, not provided so far' % k)
logger.warning('define %r required, not provided so far' % k)
pass
return '\n'.join(ret)
@ -510,15 +518,15 @@ def scan_use_defs(dirname):
if doc.get('example', False):
continue
if doc.get('unimplemented', False):
print('WARNING: unimplemented: %s' % fn)
logger.warning('unimplemented: %s' % fn)
continue
dockeys = doc.keys()
for k in dockeys:
if not k in allowed_use_meta_keys:
print('WARNING: unknown key %s in metadata file %s' % (k, fn))
logger.warning('unknown key %s in metadata file %s' % (k, fn))
for k in required_use_meta_keys:
if not k in dockeys:
print('WARNING: missing key %s in metadata file %s' % (k, fn))
logger.warning('missing key %s in metadata file %s' % (k, fn))
use_defs[doc['define']] = doc
@ -541,15 +549,15 @@ def scan_opt_defs(dirname):
if doc.get('example', False):
continue
if doc.get('unimplemented', False):
print('WARNING: unimplemented: %s' % fn)
logger.warning('unimplemented: %s' % fn)
continue
dockeys = doc.keys()
for k in dockeys:
if not k in allowed_opt_meta_keys:
print('WARNING: unknown key %s in metadata file %s' % (k, fn))
logger.warning('unknown key %s in metadata file %s' % (k, fn))
for k in required_opt_meta_keys:
if not k in dockeys:
print('WARNING: missing key %s in metadata file %s' % (k, fn))
logger.warning('missing key %s in metadata file %s' % (k, fn))
opt_defs[doc['define']] = doc
@ -582,7 +590,7 @@ def scan_helper_snippets(dirname): # DUK_F_xxx snippets
for fn in os.listdir(dirname):
if (fn[0:6] != 'DUK_F_'):
continue
#print('Autoscanning snippet: %s' % fn)
logger.debug('Autoscanning snippet: %s' % fn)
helper_snippets.append(Snippet.fromFile(os.path.join(dirname, fn)))
def get_opt_defs(removed=True, deprecated=True, unused=True):
@ -671,7 +679,7 @@ def get_tag_list_with_preferred_order(preferred):
if tag not in tags:
tags.append(tag)
#print('Effective tag order: %r' % tags)
logger.debug('Effective tag order: %r' % tags)
return tags
def rst_format(text):
@ -803,11 +811,11 @@ def get_forced_options(opts):
if use_defs.has_key(k):
pass # key is known
else:
print('WARNING: option override key %s not defined in metadata, ignoring' % k)
logger.warning('option override key %s not defined in metadata, ignoring' % k)
forced_opts[k] = doc[k] # shallow copy
if len(forced_opts.keys()) > 0:
print('Overrides: %s' % json.dumps(forced_opts))
logger.debug('Overrides: %s' % json.dumps(forced_opts))
return forced_opts
@ -968,11 +976,11 @@ def add_feature_option_handling(opts, ret, forced_opts, already_provided_keys):
# For some options like DUK_OPT_PACKED_TVAL the default comes
# from platform definition.
if doc.get('feature_no_default', False):
print('Skip default for option %s' % config_define)
logger.debug('Skip default for option %s' % config_define)
ret.line('/* Already provided above */')
elif already_provided_keys.has_key(config_define):
# This is a fallback in case config option metadata is wrong.
print('Skip default for option %s (already provided but not flagged in metadata!)' % config_define)
logger.debug('Skip default for option %s (already provided but not flagged in metadata!)' % config_define)
ret.line('/* Already provided above */')
else:
emit_default_from_config_meta(ret, doc, forced_opts, undef_done)
@ -1034,7 +1042,12 @@ def generate_duk_config_header(opts, meta_dir):
ret = FileBuilder(base_dir=os.path.join(meta_dir, 'header-snippets'), \
use_cpp_warning=opts.use_cpp_warning)
# Parse forced options. Warn about missing forced options for which
# providing a value is strongly recommended.
forced_opts = get_forced_options(opts)
for doc in use_defs_list:
if doc.get('warn_if_missing', False) and not forced_opts.has_key(doc['define']):
logger.warning('Recommended config option ' + doc['define'] + ' not provided')
platforms = None
with open(os.path.join(meta_dir, 'platforms.yaml'), 'rb') as f:
@ -1302,7 +1315,7 @@ def generate_duk_config_header(opts, meta_dir):
# Automatic DUK_OPT_xxx feature option handling
if opts.support_feature_options:
print('Autogenerating feature option (DUK_OPT_xxx) support')
logger.debug('Autogenerating feature option (DUK_OPT_xxx) support')
tmp = Snippet(ret.join().split('\n'))
add_feature_option_handling(opts, ret, forced_opts, tmp.provides)
@ -1353,7 +1366,7 @@ def generate_duk_config_header(opts, meta_dir):
ret.chdr_block_heading('Autogenerated defaults')
for k in need_keys:
#print('config option %s not covered by manual snippets, emitting default automatically' % k)
logger.debug('config option %s not covered by manual snippets, emitting default automatically' % k)
emit_default_from_config_meta(ret, use_defs[k], {}, False)
ret.empty()
@ -1471,6 +1484,8 @@ def add_genconfig_optparse_options(parser, direct=False):
parser.add_option('--git-commit', dest='git_commit', default=None, help='git commit hash to be included in header comments')
parser.add_option('--git-describe', dest='git_describe', default=None, help='git describe string to be included in header comments')
parser.add_option('--git-branch', dest='git_branch', default=None, help='git branch string to be included in header comments')
parser.add_option('--quiet', dest='quiet', action='store_true', default=False, help='Suppress info messages (show warnings)')
parser.add_option('--verbose', dest='verbose', action='store_true', default=False, help='Show verbose debug messages')
def parse_options():
commands = [
@ -1491,6 +1506,12 @@ def parse_options():
return opts, args
def genconfig(opts, args):
# Log level.
if opts.quiet:
logger.setLevel(logging.WARNING)
elif opts.verbose:
logger.setLevel(logging.DEBUG)
meta_dir = opts.config_metadata
if opts.config_metadata is None:
if os.path.isdir(os.path.join('.', 'config-options')):
@ -1506,9 +1527,9 @@ def genconfig(opts, args):
scan_opt_defs(os.path.join(meta_dir, 'feature-options'))
scan_use_tags()
scan_tags_meta(os.path.join(meta_dir, 'tags.yaml'))
print('%s, scanned %d DUK_OPT_xxx, %d DUK_USE_XXX, %d helper snippets' % \
logger.debug('%s, scanned %d DUK_OPT_xxx, %d DUK_USE_XXX, %d helper snippets' % \
(metadata_src_text, len(opt_defs.keys()), len(use_defs.keys()), len(helper_snippets)))
#print('Tags: %r' % use_tags_list)
logger.debug('Tags: %r' % use_tags_list)
if len(args) == 0:
raise Exception('missing command')
@ -1518,18 +1539,30 @@ def genconfig(opts, args):
# Generate a duk_config.h header with platform, compiler, and
# architecture either autodetected (default) or specified by
# user. Support for autogenerated DUK_OPT_xxx flags is also
# selected by the user.
desc = [
'platform=' + ('any', opts.platform)[opts.platform is not None],
'architecture=' + ('any', opts.architecture)[opts.architecture is not None],
'compiler=' + ('any', opts.compiler)[opts.compiler is not None]
]
if opts.dll:
desc.append('dll mode')
logger.info('Creating duk_config.h: ' + ', '.join(desc))
result = generate_duk_config_header(opts, meta_dir)
with open(opts.output, 'wb') as f:
f.write(result)
logger.debug('Wrote duk_config.h to ' + str(opts.output))
elif cmd == 'feature-documentation':
logger.info('Creating feature option documentation')
result = generate_feature_option_documentation(opts)
with open(opts.output, 'wb') as f:
f.write(result)
logger.debug('Wrote feature option documentation to ' + str(opts.output))
elif cmd == 'config-documentation':
logger.info('Creating config option documentation')
result = generate_config_option_documentation(opts)
with open(opts.output, 'wb') as f:
f.write(result)
logger.debug('Wrote config option documentation to ' + str(opts.output))
else:
raise Exception('invalid command: %r' % cmd)
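The same logging bootstrap recurs in each tool touched by this commit: a module-level logger writing to stdout, with `--quiet` and `--verbose` mapped onto standard `logging` levels. A condensed, runnable sketch of the pattern (the logger name and the `apply_log_level` helper are illustrative, not part of the commit):

    import logging
    import sys

    # Same format string as the tools above; aligns logger name and level.
    logging.basicConfig(level=logging.INFO, stream=sys.stdout,
                        format='%(name)-21s %(levelname)-7s %(message)s')
    logger = logging.getLogger('genconfig.py')

    def apply_log_level(logger, quiet=False, verbose=False):
        # --quiet keeps warnings/errors only; --verbose enables debug tracing.
        if quiet:
            logger.setLevel(logging.WARNING)
        elif verbose:
            logger.setLevel(logging.DEBUG)

    apply_log_level(logger, verbose=True)
    logger.debug('debug tracing now visible')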

20
tools/merge_debug_meta.py

@ -3,7 +3,15 @@
# Merge debugger YAML metadata files and output a merged JSON metadata file.
#
import os, sys, json, yaml
import logging
import sys
logging.basicConfig(level=logging.INFO, stream=sys.stdout, format='%(name)-21s %(levelname)-7s %(message)s')
logger = logging.getLogger('merge_debug_meta.py')
logger.setLevel(logging.INFO)
import os
import json
import yaml
import optparse
if __name__ == '__main__':
@ -13,8 +21,16 @@ if __name__ == '__main__':
parser.add_option('--debug-commands', dest='debug_commands', help='YAML metadata for debug commands')
parser.add_option('--debug-errors', dest='debug_errors', help='YAML metadata for debug protocol error codes')
parser.add_option('--opcodes', dest='opcodes', help='YAML metadata for opcodes')
parser.add_option('--quiet', dest='quiet', action='store_true', default=False, help='Suppress info messages (show warnings)')
parser.add_option('--verbose', dest='verbose', action='store_true', default=False, help='Show verbose debug messages')
(opts, args) = parser.parse_args()
# Log level.
if opts.quiet:
logger.setLevel(logging.WARNING)
elif opts.verbose:
logger.setLevel(logging.DEBUG)
res = {}
def merge(fn):
with open(fn, 'rb') as f:
@ -29,4 +45,4 @@ if __name__ == '__main__':
with open(opts.output, 'wb') as f:
f.write(json.dumps(res, indent=4) + '\n')
print('Wrote merged debugger metadata to ' + str(opts.output))
logger.debug('Wrote merged debugger metadata to ' + str(opts.output))
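The body of `merge(fn)` is elided by the hunk above; below is a minimal sketch of what the step plausibly does, assuming each YAML metadata file contributes top-level keys into the shared `res` dict, with later files overwriting earlier ones on conflict:

    import json
    import yaml

    res = {}

    def merge(fn):
        with open(fn, 'rb') as f:
            doc = yaml.safe_load(f)
        for k in doc.keys():
            res[k] = doc[k]  # later files win on key conflicts

    # merge('duk_debugcommands.yaml'); merge('duk_opcodes.yaml'); ...
    print(json.dumps(res, indent=4))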

2
tools/prepare_unicode_data.py

@ -12,6 +12,8 @@ def main():
parser = optparse.OptionParser()
parser.add_option('--unicode-data', dest='unicode_data')
parser.add_option('--output', dest='output')
parser.add_option('--quiet', dest='quiet', action='store_true', default=False, help='Suppress info messages (show warnings)')
parser.add_option('--verbose', dest='verbose', action='store_true', default=False, help='Show verbose debug messages')
(opts, args) = parser.parse_args()
assert(opts.unicode_data is not None)
assert(opts.output is not None)

108
util/dist.py

@ -4,15 +4,20 @@
# of this directory can then be packaged into a source distributable.
#
import os
import logging
import sys
logging.basicConfig(level=logging.INFO, stream=sys.stdout, format='%(name)-21s %(levelname)-7s %(message)s')
logger = logging.getLogger('dist.py')
logger.setLevel(logging.INFO)
import os
import re
import json
import shutil
import glob
import optparse
import subprocess
import tarfile
import logging
# Helpers.
@ -27,13 +32,13 @@ def exec_get_stdout(cmd, input=None, default=None, print_stdout=False):
sys.stdout.write(ret[1]) # print stderr on error
sys.stdout.flush()
if default is not None:
print('WARNING: command %r failed, return default' % cmd)
logger.warning('command %r failed, return default' % cmd)
return default
raise Exception('command failed, return code %d: %r' % (proc.returncode, cmd))
return ret[0]
except:
if default is not None:
print('WARNING: command %r failed, return default' % cmd)
logger.warning('command %r failed, return default' % cmd)
return default
raise
@ -88,26 +93,9 @@ def read_file(src, strip_last_nl=False):
def delete_matching_files(dirpath, cb):
for fn in os.listdir(dirpath):
if os.path.isfile(os.path.join(dirpath, fn)) and cb(fn):
#print('Deleting %r' % os.path.join(dirpath, fn))
logger.debug('Deleting %r' % os.path.join(dirpath, fn))
os.unlink(os.path.join(dirpath, fn))
def create_targz(dstfile, filelist):
# https://docs.python.org/2/library/tarfile.html#examples
def _add(tf, fn): # recursive add
#print('Adding to tar: ' + fn)
if os.path.isdir(fn):
for i in sorted(os.listdir(fn)):
_add(tf, os.path.join(fn, i))
elif os.path.isfile(fn):
tf.add(fn)
else:
raise Exception('invalid file: %r' % fn)
with tarfile.open(dstfile, 'w:gz') as tf:
for fn in filelist:
_add(tf, fn)
def glob_files(pattern):
return glob.glob(pattern)
@ -133,13 +121,10 @@ def get_duk_version():
raise Exception('cannot figure out duktape version')
def create_dist_directories(dist):
if os.path.isdir(dist):
shutil.rmtree(dist)
if os.path.exists(dist):
raise Exception('dist target directory %s already exists, please delete first' % repr(dist))
mkdir(dist)
mkdir(os.path.join(dist, 'src-input'))
mkdir(os.path.join(dist, 'src-separate'))
mkdir(os.path.join(dist, 'src'))
mkdir(os.path.join(dist, 'src-noline'))
mkdir(os.path.join(dist, 'tools'))
mkdir(os.path.join(dist, 'config'))
mkdir(os.path.join(dist, 'extras'))
@ -189,13 +174,17 @@ def check_cwd_duktape_repo_root():
def parse_options():
parser = optparse.OptionParser()
parser.add_option('--create-spdx', dest='create_spdx', action='store_true', default=False, help='Create SPDX license file')
parser.add_option('--repo-directory', dest='repo_directory', default=None, help='Duktape repo directory (default is CWD)')
parser.add_option('--output-directory', dest='output_directory', default=None, help='Dist output directory (created automatically, must not exist; default is <repo>/dist)')
parser.add_option('--git-commit', dest='git_commit', default=None, help='Force git commit hash')
parser.add_option('--git-describe', dest='git_describe', default=None, help='Force git describe')
parser.add_option('--git-branch', dest='git_branch', default=None, help='Force git branch name')
parser.add_option('--create-spdx', dest='create_spdx', action='store_true', default=False, help='Create SPDX license file')
parser.add_option('--rom-support', dest='rom_support', action='store_true', help=optparse.SUPPRESS_HELP)
parser.add_option('--rom-auto-lightfunc', dest='rom_auto_lightfunc', action='store_true', default=False, help=optparse.SUPPRESS_HELP)
parser.add_option('--user-builtin-metadata', dest='user_builtin_metadata', action='append', default=[], help=optparse.SUPPRESS_HELP)
parser.add_option('--quiet', dest='quiet', action='store_true', default=False, help='Suppress info messages (show warnings)')
parser.add_option('--verbose', dest='verbose', action='store_true', default=False, help='Show verbose debug messages')
(opts, args) = parser.parse_args()
return opts, args
@ -239,18 +228,43 @@ def main():
# Basic option parsing, Python module check, CWD check.
opts, args = parse_options()
# Log level.
forward_loglevel = []
if opts.quiet:
logger.setLevel(logging.WARNING)
forward_loglevel = [ '--quiet' ]
elif opts.verbose:
logger.setLevel(logging.DEBUG)
forward_loglevel = [ '--verbose' ]
check_python_modules(opts)
check_cwd_duktape_repo_root()
if opts.repo_directory is None:
opts.repo_directory = os.path.abspath('.')
logger.info('No --repo-directory option, defaulting to current directory %s' % opts.repo_directory)
check_cwd_duktape_repo_root()
opts.repo_directory = os.path.abspath(opts.repo_directory)
logger.debug('Using repo directory: %s' % opts.repo_directory)
if opts.output_directory is None:
opts.output_directory = os.path.abspath(os.path.join(opts.repo_directory, 'dist'))
logger.info('No --output-directory option, defaulting to repo/dist directory %s' % opts.output_directory)
opts.output_directory = os.path.abspath(opts.output_directory)
logger.debug('Using output directory: %s' % opts.output_directory)
# Obsolete options check.
if opts.rom_support or opts.rom_auto_lightfunc or len(opts.user_builtin_metadata) > 0:
if opts.rom_support or opts.rom_auto_lightfunc:
raise Exception('obsolete ROM support argument(s), use tools/configure.py instead')
if len(opts.user_builtin_metadata) > 0:
raise Exception('obsolete --user-builtin-metadata argument, use tools/configure.py and --builtin-file instead')
# Figure out directories, git info, Duktape version, etc.
entry_pwd = os.getcwd()
dist = os.path.join(entry_pwd, 'dist')
entry_cwd = os.getcwd()
dist = opts.output_directory
os.chdir(opts.repo_directory)
duk_version, duk_major, duk_minor, duk_patch, duk_version_formatted = get_duk_version()
@ -271,16 +285,17 @@ def main():
git_describe_cstring = cstring(git_describe)
git_branch_cstring = cstring(git_branch)
print('Dist for Duktape version %s, commit %s, describe %s, branch %s' % \
(duk_version_formatted, git_commit, git_describe, git_branch))
logger.info('Dist for Duktape version %s, commit %s, describe %s, branch %s' % \
(duk_version_formatted, git_commit, git_describe, git_branch))
# Create dist directory structure, copy files.
print('Create dist directories and copy static files')
logger.debug('Create dist directories and copy static files')
os.chdir(opts.repo_directory)
create_dist_directories(dist)
os.chdir(entry_pwd)
os.chdir(opts.repo_directory)
copy_files([
'builtins.yaml',
@ -724,7 +739,7 @@ def main():
'--debug-commands', os.path.join('debugger', 'duk_debugcommands.yaml'),
'--debug-errors', os.path.join('debugger', 'duk_debugerrors.yaml'),
'--opcodes', os.path.join('debugger', 'duk_opcodes.yaml')
])
] + forward_loglevel)
# Add a build metadata file.
@ -743,7 +758,7 @@ def main():
# Build prepared sources (src/, src-noline/, src-separate/) with default
# config. This is done using tools and metadata in the dist directory.
print('Create prepared sources for default configuration')
logger.debug('Create prepared sources for default configuration')
def prep_default_sources(dirname, extraopts):
cmd = [
@ -763,6 +778,7 @@ def main():
for i in opts.user_builtin_metadata:
cmd.append('--user-builtin-metadata')
cmd.append(i)
cmd += forward_loglevel
exec_print_stdout(cmd)
prep_default_sources('src', [ '--line-directives' ])
@ -776,7 +792,7 @@ def main():
# Create SPDX license once all other files are in place (and cleaned).
if opts.create_spdx:
print('Create SPDX license')
logger.debug('Create SPDX license')
try:
exec_get_stdout([
sys.executable,
@ -784,15 +800,15 @@ def main():
os.path.join(dist, 'license.spdx')
])
except:
print('')
print('***')
print('*** WARNING: Failed to create SPDX license, this should not happen for an official release!')
print('***')
print('')
logger.warning('')
logger.warning('***')
logger.warning('*** WARNING: Failed to create SPDX license, this should not happen for an official release!')
logger.warning('***')
logger.warning('')
else:
print('Skip SPDX license creation')
logger.debug('Skip SPDX license creation')
print('Dist finished successfully')
logger.info('Dist finished successfully')
if __name__ == '__main__':
main()
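Note how `dist.py` propagates its own verbosity to the child tools it invokes: the parsed `--quiet`/`--verbose` flag is captured once as `forward_loglevel` and appended to every subprocess command line. A self-contained sketch of the pattern (the `genconfig.py` target and its arguments here are illustrative, not taken from the commit):

    import os
    import subprocess
    import sys

    def forwarded_args(quiet=False, verbose=False):
        # Mirrors dist.py: capture the log-level flag once, reuse everywhere.
        if quiet:
            return ['--quiet']
        elif verbose:
            return ['--verbose']
        return []

    forward_loglevel = forwarded_args(verbose=True)
    cmd = [sys.executable, os.path.join('tools', 'genconfig.py'), '--help']
    subprocess.check_call(cmd + forward_loglevel)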

12
util/example_rombuild.sh

@ -14,11 +14,11 @@ make clean dist
rm -rf dist/src dist/src-noline dist/src-separate
$PYTHON dist/tools/configure.py \
--source-directory dist/src-input \
--output-directory dist \
--output-directory dist/src \
--rom-support \
--rom-auto-lightfunc \
--user-builtin-metadata util/example_user_builtins1.yaml \
--user-builtin-metadata util/example_user_builtins2.yaml \
--builtin-file util/example_user_builtins1.yaml \
--builtin-file util/example_user_builtins2.yaml \
--config-metadata dist/config \
-DDUK_USE_ROM_STRINGS \
-DDUK_USE_ROM_OBJECTS \
@ -36,11 +36,11 @@ make duk dukd # XXX: currently fails to start, DUK_CMDLINE_LOGGING_SUPPORT, DUK
rm -rf dist/src dist/src-noline dist/src-separate
$PYTHON dist/tools/configure.py \
--source-directory dist/src-input \
--output-directory dist \
--output-directory dist/src \
--rom-support \
--rom-auto-lightfunc \
--user-builtin-metadata util/example_user_builtins1.yaml \
--user-builtin-metadata util/example_user_builtins2.yaml \
--builtin-file util/example_user_builtins1.yaml \
--builtin-file util/example_user_builtins2.yaml \
--config-metadata dist/config \
--support-feature-options \
-DDUK_USE_ROM_STRINGS \
