[NEW] Add Jsoncpp component.
git-svn-id: svn://svn.tuxfamily.org/svnroot/notepadplus/repository/trunk@1248 f5eea248-9336-0410-98b8-ebc06183d4e3
parent 908d12a061
commit e995a13b38
1
PowerEditor/src/jsoncpp/AUTHORS
Normal file
@ -0,0 +1 @@
Baptiste Lepilleur <blep@users.sourceforge.net>
1
PowerEditor/src/jsoncpp/LICENSE
Normal file
@ -0,0 +1 @@
The json-cpp library and this documentation are in Public Domain.
117
PowerEditor/src/jsoncpp/README.txt
Normal file
@ -0,0 +1,117 @@
* Introduction:
  =============

JSON (JavaScript Object Notation) is a lightweight data-interchange format.
It can represent integers, real numbers, strings, an ordered sequence of
values, and a collection of name/value pairs.

JsonCpp is a simple API for manipulating JSON values, handling their
serialization to and unserialization from strings.

It can also preserve existing comments across unserialization/serialization
steps, making it a convenient format for storing user input files.

Unserialization parsing is user friendly and provides precise error reports.
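The value kinds listed above map directly onto native types in most languages. As a quick illustration (using only Python's standard json module, independent of JsonCpp):

```python
import json

# A document exercising every JSON value kind mentioned above.
text = '{"count": 3, "ratio": 0.5, "name": "demo", "seq": [1, 2], "flag": true}'
doc = json.loads(text)

assert doc["count"] == 3          # integer
assert doc["ratio"] == 0.5        # real number
assert doc["name"] == "demo"      # string
assert doc["seq"] == [1, 2]       # ordered sequence of values
assert isinstance(doc, dict)      # collection of name/value pairs
```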

* Building/Testing:
  =================

JsonCpp uses Scons (http://www.scons.org) as a build system. Scons requires
Python to be installed (http://www.python.org).

You can download the scons-local distribution from the following url:
http://sourceforge.net/project/showfiles.php?group_id=30337&package_id=67375

Unzip it in the directory where you found this README file. scons.py should be
at the same level as README.

python scons.py platform=PLTFRM [TARGET]

where PLTFRM may be one of:
    suncc      Sun C++ (Solaris)
    vacpp      Visual Age C++ (AIX)
    mingw
    msvc6      Microsoft Visual Studio 6 service pack 5-6
    msvc70     Microsoft Visual Studio 2002
    msvc71     Microsoft Visual Studio 2003
    msvc80     Microsoft Visual Studio 2005
    linux-gcc  Gnu C++ (linux, also reported to work for Mac OS X)

Adding a platform is fairly simple: you need to change the SConstruct file
to do so.

and TARGET may be:
    check: build library and run unit tests.

* Running the test manually:
  ==========================

cd test
# This will run the Reader/Writer tests
python runjsontests.py "path to jsontest.exe"

# This will run the Reader/Writer tests, using JSONChecker test suite
# (http://www.json.org/JSON_checker/).
# Notes: not all tests pass: JsonCpp is too lenient (for example,
# it allows an integer to start with '0'). The goal is to improve
# strict mode parsing to get all tests to pass.
python runjsontests.py --with-json-checker "path to jsontest.exe"

# This will run the unit tests (mostly Value)
python rununittests.py "path to test_lib_json.exe"

You can run the tests using valgrind:
python rununittests.py --valgrind "path to test_lib_json.exe"

* Building the documentation:
  ===========================

Run the Python script doxybuild.py from the top directory:

python doxybuild.py --open --with-dot

See doxybuild.py --help for options.

* Adding a reader/writer test:
  ============================

To add a test, you need to create two files in test/data:
- a TESTNAME.json file, that contains the input document in JSON format.
- a TESTNAME.expected file, that contains a flattened representation of
  the input document.

TESTNAME.expected file format:
- each line represents a JSON element of the element tree represented
  by the input document.
- each line has two parts: the path to access the element, separated from
  the element value by '='. Array and object values are always empty
  (e.g. represented by either [] or {}).
- element path: '.' represents the root element, and is used to separate
  object members. [N] is used to specify the value of an array element
  at index N.
See test_complex_01.json and test_complex_01.expected to better understand
element paths.
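The flattened format above can be sketched in a few lines of Python. This is only a rough approximation of what the test tooling produces (not the actual jsontest.exe code; the exact spacing and value rendering of real .expected files may differ):

```python
import json

def flatten(value, path="."):
    """Yield 'path=value' lines in the style of a TESTNAME.expected file."""
    if isinstance(value, dict):
        yield "%s={}" % path          # object values are written empty
        for name in value:
            child = path + name if path == "." else "%s.%s" % (path, name)
            for line in flatten(value[name], child):
                yield line
    elif isinstance(value, list):
        yield "%s=[]" % path          # array values are written empty
        for index, item in enumerate(value):
            for line in flatten(item, "%s[%d]" % (path, index)):
                yield line
    else:
        yield "%s=%s" % (path, json.dumps(value))

doc = json.loads('{"indent": {"length": 3}, "plug-ins": ["python", "ruby"]}')
lines = list(flatten(doc))
```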

* Understanding reader/writer test output:
  ========================================

When a test is run, output files are generated alongside the input test files.
Below is a short description of the content of each file:
- test_complex_01.json: input JSON document.
- test_complex_01.expected: flattened JSON element tree used to check if
  parsing was correct.
- test_complex_01.actual: flattened JSON element tree produced by
  jsontest.exe from reading test_complex_01.json.
- test_complex_01.rewrite: JSON document written by jsontest.exe using the
  Json::Value parsed from test_complex_01.json and serialized using
  Json::StyledWriter.
- test_complex_01.actual-rewrite: flattened JSON element tree produced by
  jsontest.exe from reading test_complex_01.rewrite.
- test_complex_01.process-output: jsontest.exe output, typically useful for
  understanding parsing errors.
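Conceptually, a test passes when the .actual flattening matches .expected line for line. A minimal sketch of that comparison (an illustration only, not the real runjsontests.py logic):

```python
def compare_flattened(expected_text, actual_text):
    """Return human-readable differences between two flattened element trees."""
    expected = expected_text.strip().splitlines()
    actual = actual_text.strip().splitlines()
    diffs = []
    for index, (want, got) in enumerate(zip(expected, actual)):
        if want != got:
            diffs.append("line %d: expected %r, got %r" % (index + 1, want, got))
    if len(expected) != len(actual):
        diffs.append("line count differs: %d vs %d" % (len(expected), len(actual)))
    return diffs

# A matching pair produces no differences; a mismatch is reported by line.
ok = compare_flattened(".={}\n.a=1", ".={}\n.a=1")
bad = compare_flattened(".={}\n.a=1", ".={}\n.a=2")
```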
235
PowerEditor/src/jsoncpp/SConstruct
Normal file
@ -0,0 +1,235 @@
"""
Notes:
- shared library support is buggy: it assumes that a static and dynamic library can be built from the same object files. This is not true on many platforms. For this reason it is only enabled on linux-gcc at the current time.

To add a platform:
- add its name in options allowed_values below
- add tool initialization for this platform. Search for "if platform == 'suncc'" as an example.
"""

import os
import os.path
import sys

JSONCPP_VERSION = open(File('#version').abspath,'rt').read().strip()
DIST_DIR = '#dist'

options = Variables()
options.Add( EnumVariable('platform',
                          'Platform (compiler/stl) used to build the project',
                          'msvc71',
                          allowed_values='suncc vacpp mingw msvc6 msvc7 msvc71 msvc80 linux-gcc'.split(),
                          ignorecase=2) )

try:
    platform = ARGUMENTS['platform']
    if platform == 'linux-gcc':
        CXX = 'g++' # not quite right, but env is not yet available.
        import commands
        version = commands.getoutput('%s -dumpversion' %CXX)
        platform = 'linux-gcc-%s' %version
        print "Using platform '%s'" %platform
        LD_LIBRARY_PATH = os.environ.get('LD_LIBRARY_PATH', '')
        LD_LIBRARY_PATH = "%s:libs/%s" %(LD_LIBRARY_PATH, platform)
        os.environ['LD_LIBRARY_PATH'] = LD_LIBRARY_PATH
        print "LD_LIBRARY_PATH =", LD_LIBRARY_PATH
except KeyError:
    print 'You must specify a "platform"'
    sys.exit(2)

print "Building using PLATFORM =", platform

rootbuild_dir = Dir('#buildscons')
build_dir = os.path.join( '#buildscons', platform )
bin_dir = os.path.join( '#bin', platform )
lib_dir = os.path.join( '#libs', platform )
sconsign_dir_path = Dir(build_dir).abspath
sconsign_path = os.path.join( sconsign_dir_path, '.sconsign.dbm' )

# Ensure the build directory exists (SConsignFile fails otherwise!)
if not os.path.exists( sconsign_dir_path ):
    os.makedirs( sconsign_dir_path )

# Store all dependency signatures in a database
SConsignFile( sconsign_path )

def make_environ_vars():
    """Returns a dictionary with environment variables to use when compiling."""
    # PATH is required to find the compiler
    # TEMP is required for at least mingw
    vars = {}
    for name in ('PATH', 'TEMP', 'TMP'):
        if name in os.environ:
            vars[name] = os.environ[name]
    return vars


env = Environment( ENV = make_environ_vars(),
                   toolpath = ['scons-tools'],
                   tools=[] ) #, tools=['default'] )

if platform == 'suncc':
    env.Tool( 'sunc++' )
    env.Tool( 'sunlink' )
    env.Tool( 'sunar' )
    env.Append( CCFLAGS = ['-mt'] )
elif platform == 'vacpp':
    env.Tool( 'default' )
    env.Tool( 'aixcc' )
    env['CXX'] = 'xlC_r'   # scons does not pick up the correct one!
    # using xlC_r ensures multi-threading is enabled:
    # http://publib.boulder.ibm.com/infocenter/pseries/index.jsp?topic=/com.ibm.vacpp7a.doc/compiler/ref/cuselect.htm
    env.Append( CCFLAGS = '-qrtti=all',
                LINKFLAGS='-bh:5' )  # -bh:5 removes duplicate symbol warning
elif platform == 'msvc6':
    env['MSVS_VERSION']='6.0'
    for tool in ['msvc', 'msvs', 'mslink', 'masm', 'mslib']:
        env.Tool( tool )
    env['CXXFLAGS']='-GR -GX /nologo /MT'
elif platform == 'msvc70':
    env['MSVS_VERSION']='7.0'
    for tool in ['msvc', 'msvs', 'mslink', 'masm', 'mslib']:
        env.Tool( tool )
    env['CXXFLAGS']='-GR -GX /nologo /MT'
elif platform == 'msvc71':
    env['MSVS_VERSION']='7.1'
    for tool in ['msvc', 'msvs', 'mslink', 'masm', 'mslib']:
        env.Tool( tool )
    env['CXXFLAGS']='-GR -GX /nologo /MT'
elif platform == 'msvc80':
    env['MSVS_VERSION']='8.0'
    for tool in ['msvc', 'msvs', 'mslink', 'masm', 'mslib']:
        env.Tool( tool )
    env['CXXFLAGS']='-GR -EHsc /nologo /MT'
elif platform == 'mingw':
    env.Tool( 'mingw' )
    env.Append( CPPDEFINES=[ "WIN32", "NDEBUG", "_MT" ] )
elif platform.startswith('linux-gcc'):
    env.Tool( 'default' )
    env.Append( LIBS = ['pthread'], CCFLAGS = "-Wall" )
    env['SHARED_LIB_ENABLED'] = True
else:
    print "UNSUPPORTED PLATFORM."
    env.Exit(1)

env.Tool('targz')
env.Tool('srcdist')
env.Tool('globtool')

env.Append( CPPPATH = ['#include'],
            LIBPATH = lib_dir )
short_platform = platform
if short_platform.startswith('msvc'):
    short_platform = short_platform[2:]
# Notes: on Windows you need to rebuild the source for each variant.
# The build script does not support that yet so we only build static libraries.
# This also fails on AIX because both dynamic and static libraries end with
# extension .a.
env['SHARED_LIB_ENABLED'] = env.get('SHARED_LIB_ENABLED', False)
env['LIB_PLATFORM'] = short_platform
env['LIB_LINK_TYPE'] = 'lib' # static
env['LIB_CRUNTIME'] = 'mt'
env['LIB_NAME_SUFFIX'] = '${LIB_PLATFORM}_${LIB_LINK_TYPE}${LIB_CRUNTIME}'  # must match autolink naming convention
env['JSONCPP_VERSION'] = JSONCPP_VERSION
env['BUILD_DIR'] = env.Dir(build_dir)
env['ROOTBUILD_DIR'] = env.Dir(rootbuild_dir)
env['DIST_DIR'] = DIST_DIR
if 'TarGz' in env['BUILDERS']:
    class SrcDistAdder:
        def __init__( self, env ):
            self.env = env
        def __call__( self, *args, **kw ):
            apply( self.env.SrcDist, (self.env['SRCDIST_TARGET'],) + args, kw )
    env['SRCDIST_BUILDER'] = env.TarGz
else: # If tarfile module is missing
    class SrcDistAdder:
        def __init__( self, env ):
            pass
        def __call__( self, *args, **kw ):
            pass
env['SRCDIST_ADD'] = SrcDistAdder( env )
env['SRCDIST_TARGET'] = os.path.join( DIST_DIR, 'jsoncpp-src-%s.tar.gz' % env['JSONCPP_VERSION'] )

env_testing = env.Clone( )
env_testing.Append( LIBS = ['json_${LIB_NAME_SUFFIX}'] )

def buildJSONExample( env, target_sources, target_name ):
    env = env.Clone()
    env.Append( CPPPATH = ['#'] )
    exe = env.Program( target=target_name,
                       source=target_sources )
    env['SRCDIST_ADD']( source=[target_sources] )
    global bin_dir
    return env.Install( bin_dir, exe )

def buildJSONTests( env, target_sources, target_name ):
    jsontests_node = buildJSONExample( env, target_sources, target_name )
    check_alias_target = env.Alias( 'check', jsontests_node, RunJSONTests( jsontests_node, jsontests_node ) )
    env.AlwaysBuild( check_alias_target )

def buildUnitTests( env, target_sources, target_name ):
    jsontests_node = buildJSONExample( env, target_sources, target_name )
    check_alias_target = env.Alias( 'check', jsontests_node,
                                    RunUnitTests( jsontests_node, jsontests_node ) )
    env.AlwaysBuild( check_alias_target )

def buildLibrary( env, target_sources, target_name ):
    static_lib = env.StaticLibrary( target=target_name + '_${LIB_NAME_SUFFIX}',
                                    source=target_sources )
    global lib_dir
    env.Install( lib_dir, static_lib )
    if env['SHARED_LIB_ENABLED']:
        shared_lib = env.SharedLibrary( target=target_name + '_${LIB_NAME_SUFFIX}',
                                        source=target_sources )
        env.Install( lib_dir, shared_lib )
    env['SRCDIST_ADD']( source=[target_sources] )

Export( 'env env_testing buildJSONExample buildLibrary buildJSONTests buildUnitTests' )

def buildProjectInDirectory( target_directory ):
    global build_dir
    target_build_dir = os.path.join( build_dir, target_directory )
    target = os.path.join( target_directory, 'sconscript' )
    SConscript( target, build_dir=target_build_dir, duplicate=0 )
    env['SRCDIST_ADD']( source=[target] )


def runJSONTests_action( target, source = None, env = None ):
    # Add test scripts to python path
    jsontest_path = Dir( '#test' ).abspath
    sys.path.insert( 0, jsontest_path )
    data_path = os.path.join( jsontest_path, 'data' )
    import runjsontests
    return runjsontests.runAllTests( os.path.abspath(source[0].path), data_path )

def runJSONTests_string( target, source = None, env = None ):
    return 'RunJSONTests("%s")' % source[0]

import SCons.Action
ActionFactory = SCons.Action.ActionFactory
RunJSONTests = ActionFactory(runJSONTests_action, runJSONTests_string )

def runUnitTests_action( target, source = None, env = None ):
    # Add test scripts to python path
    jsontest_path = Dir( '#test' ).abspath
    sys.path.insert( 0, jsontest_path )
    import rununittests
    return rununittests.runAllTests( os.path.abspath(source[0].path) )

def runUnitTests_string( target, source = None, env = None ):
    return 'RunUnitTests("%s")' % source[0]

RunUnitTests = ActionFactory(runUnitTests_action, runUnitTests_string )

env.Alias( 'check' )

srcdist_cmd = env['SRCDIST_ADD']( source = """
    AUTHORS README.txt SConstruct
    """.split() )
env.Alias( 'src-dist', srcdist_cmd )

buildProjectInDirectory( 'src/jsontestrunner' )
buildProjectInDirectory( 'src/lib_json' )
buildProjectInDirectory( 'src/test_lib_json' )
#print env.Dump()
Binary file not shown.
Binary file not shown.
1534
PowerEditor/src/jsoncpp/doc/doxyfile.in
Normal file
File diff suppressed because it is too large
23
PowerEditor/src/jsoncpp/doc/footer.html
Normal file
@ -0,0 +1,23 @@
<hr>
<table width="100%">
  <tr>
    <td width="10%" align="left" valign="center">
      <a href="http://sourceforge.net">
      <img
         src="http://sourceforge.net/sflogo.php?group_id=144446"
         width="88" height="31" border="0" alt="SourceForge Logo"></a>
    </td>
    <td width="20%" align="left" valign="center">
      hosts this site.
    </td>
    <td>
    </td>
    <td align="right" valign="center">
      Send comments to:<br>
      <a href="mailto:jsoncpp-devel@lists.sourceforge.net">Json-cpp Developers</a>
    </td>
  </tr>
</table>

</body>
</html>
24
PowerEditor/src/jsoncpp/doc/header.html
Normal file
@ -0,0 +1,24 @@
<html>
<head>
<title>
JsonCpp - JSON data format manipulation library
</title>
<link href="doxygen.css" rel="stylesheet" type="text/css">
<link href="tabs.css" rel="stylesheet" type="text/css">
</head>

<body bgcolor="#ffffff">
<table width="100%">
  <tr>
    <td width="40%" align="left" valign="center">
      <a href="http://sourceforge.net/projects/jsoncpp">
      JsonCpp project page
      </a>
    </td>
    <td width="40%" align="right" valign="center">
      <a href="http://jsoncpp.sourceforge.net">JsonCpp home page</a>
    </td>
  </tr>
</table>

<hr>
116
PowerEditor/src/jsoncpp/doc/jsoncpp.dox
Normal file
@ -0,0 +1,116 @@
/**
\mainpage
\section _intro Introduction

<a HREF="http://www.json.org/">JSON (JavaScript Object Notation)</a>
is a lightweight data-interchange format.
It can represent integers, real numbers, strings, an ordered sequence of values, and
a collection of name/value pairs.

Here is an example of JSON data:
\verbatim
// Configuration options
{
    // Default encoding for text
    "encoding" : "UTF-8",

    // Plug-ins loaded at start-up
    "plug-ins" : [
        "python",
        "c++",
        "ruby"
        ],

    // Tab indent size
    "indent" : { "length" : 3, "use_space" : true }
}
\endverbatim

\section _features Features
- read and write JSON documents
- attach C and C++ style comments to elements during parsing
- rewrite JSON documents preserving original comments

Notes: Comments used to be supported in JSON but were removed for
portability (C like comments are not supported in Python). Since
comments are useful in configuration/input files, this feature was
preserved.

\section _example Code example

\code
Json::Value root;   // will contain the root value after parsing.
Json::Reader reader;
bool parsingSuccessful = reader.parse( config_doc, root );
if ( !parsingSuccessful )
{
    // report to the user the failure and their locations in the document.
    std::cout << "Failed to parse configuration\n"
              << reader.getFormatedErrorMessages();
    return;
}

// Get the value of the member of root named 'encoding'; return 'UTF-8' if there is no
// such member.
std::string encoding = root.get("encoding", "UTF-8" ).asString();
// Get the value of the member of root named 'plug-ins'; return a 'null' value if
// there is no such member.
const Json::Value plugins = root["plug-ins"];
for ( int index = 0; index < plugins.size(); ++index )  // Iterates over the sequence elements.
    loadPlugIn( plugins[index].asString() );

setIndentLength( root["indent"].get("length", 3).asInt() );
setIndentUseSpace( root["indent"].get("use_space", true).asBool() );

// ...
// At application shutdown, to make the new configuration document:
// Since Json::Value has implicit constructors for all value types, it is not
// necessary to explicitly construct the Json::Value object:
root["encoding"] = getCurrentEncoding();
root["indent"]["length"] = getCurrentIndentLength();
root["indent"]["use_space"] = getCurrentIndentUseSpace();

Json::StyledWriter writer;
// Make a new JSON document for the configuration. Preserve original comments.
std::string outputConfig = writer.write( root );

// You can also use streams. This will put the contents of any JSON
// stream at a particular sub-value, if you'd like.
std::cin >> root["subtree"];

// And you can write to a stream, using the StyledWriter automatically.
std::cout << root;
\endcode

\section _pbuild Build instructions
The build instructions are located in the file
<a HREF="README.txt">README.txt</a> in the top-directory of the project.

Permanent link to the latest revision of the file in subversion:
<a HREF="http://svn.sourceforge.net/viewcvs.cgi/jsoncpp/README.txt?view=markup">latest README.txt</a>

\section _pdownload Download
The sources can be downloaded from the
<a HREF="http://sourceforge.net/projects/jsoncpp/files/">SourceForge download page</a>.

The latest version of the source is available in the project's subversion repository:
<a HREF="http://jsoncpp.svn.sourceforge.net/svnroot/jsoncpp/trunk/">
http://jsoncpp.svn.sourceforge.net/svnroot/jsoncpp/trunk/</a>

To checkout the source, see the following
<a HREF="http://sourceforge.net/scm/?type=svn&group_id=144446">instructions</a>.

\section _plinks Project links
- <a HREF="http://jsoncpp.sourceforge.net">json-cpp home</a>
- <a HREF="http://www.sourceforge.net/projects/jsoncpp">json-cpp sourceforge project</a>

\section _rlinks Related links
- <a HREF="http://www.json.org/">JSON</a> Specification and alternate language implementations.
- <a HREF="http://www.yaml.org/">YAML</a> A data format designed for human readability.
- <a HREF="http://www.cl.cam.ac.uk/~mgk25/unicode.html">UTF-8 and Unicode FAQ</a>.

\section _license License
The json-cpp library and this documentation are in Public Domain.

\author Baptiste Lepilleur <blep@users.sourceforge.net>
*/
1
PowerEditor/src/jsoncpp/doc/readme.txt
Normal file
@ -0,0 +1 @@
The documentation is generated using doxygen (http://www.doxygen.org).
32
PowerEditor/src/jsoncpp/doc/roadmap.dox
Normal file
@ -0,0 +1,32 @@
/*! \page roadmap JsonCpp roadmap
  \section ms_release Makes JsonCpp ready for release
  - Build system clean-up:
    - Fix build on Windows (shared-library build is broken)
    - Add enable/disable flag for static and shared library build
    - Enhance help
  - Platform portability check: (Notes: was ok on last check)
    - linux/gcc,
    - solaris/cc,
    - windows/msvc678,
    - aix/vacpp
  - Add JsonCpp version to header as numeric for use in preprocessor test
  - Remove buggy experimental hash stuff
  - Release on sourceforge download
  \section ms_strict Adds a strict mode to reader/parser
  Strict JSON support as specified in RFC 4627 (http://www.ietf.org/rfc/rfc4627.txt?number=4627).
  - Enforce only object or array as root element
  - Disable comment support
  - Get jsonchecker failing tests to pass in strict mode
  \section ms_separation Expose a json reader/writer API that does not impose using Json::Value.
  Some typical use-cases involve an application-specific structure to/from a JSON document.
  - Event based parser to allow unserializing a Json document directly into a datastructure instead of
    using the intermediate Json::Value.
  - "Stream" based parser to serialize a Json document without using Json::Value as input.
  - Performance oriented parser/writer:
    - Provides an event based parser. Should allow pulling & skipping events for ease of use.
    - Provides a JSON document builder: fast only.
  \section ms_perfo Performance tuning
  - Provides support for static property name definition avoiding allocation
  - Static property dictionary can be provided to JSON reader
  - Performance scenario & benchmarking
*/
167
PowerEditor/src/jsoncpp/doxybuild.py
Normal file
@ -0,0 +1,167 @@
"""Script to generate doxygen documentation.
"""

import re
import os
import os.path
import sys
import shutil
from devtools import tarball

def find_program(*filenames):
    """find a program in folders path_lst, and sets env[var]
    @param filenames: a list of possible names of the program to search for
    @return: the full path of the filename if found, or '' if filename could not be found
    """
    paths = os.environ.get('PATH', '').split(os.pathsep)
    suffixes = ('win32' in sys.platform ) and '.exe .com .bat .cmd' or ''
    for filename in filenames:
        for name in [filename+ext for ext in suffixes.split()]:
            for directory in paths:
                full_path = os.path.join(directory, name)
                if os.path.isfile(full_path):
                    return full_path
    return ''

def do_subst_in_file(targetfile, sourcefile, dict):
    """Replace all instances of the keys of dict with their values.
    For example, if dict is {'%VERSION%': '1.2345', '%BASE%': 'MyProg'},
    then all instances of %VERSION% in the file will be replaced with 1.2345 etc.
    """
    try:
        f = open(sourcefile, 'rb')
        contents = f.read()
        f.close()
    except:
        print "Can't read source file %s"%sourcefile
        raise
    for (k,v) in dict.items():
        v = v.replace('\\','\\\\')
        contents = re.sub(k, v, contents)
    try:
        f = open(targetfile, 'wb')
        f.write(contents)
        f.close()
    except:
        print "Can't write target file %s"%targetfile
        raise

def run_doxygen(doxygen_path, config_file, working_dir, is_silent):
    config_file = os.path.abspath( config_file )
    doxygen_path = doxygen_path
    old_cwd = os.getcwd()
    try:
        os.chdir( working_dir )
        cmd = [doxygen_path, config_file]
        print 'Running:', ' '.join( cmd )
        try:
            import subprocess
        except:
            if os.system( ' '.join( cmd ) ) != 0:
                print 'Documentation generation failed'
                return False
        else:
            if is_silent:
                process = subprocess.Popen( cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT )
            else:
                process = subprocess.Popen( cmd )
            stdout, _ = process.communicate()
            if process.returncode:
                print 'Documentation generation failed:'
                print stdout
                return False
        return True
    finally:
        os.chdir( old_cwd )

def build_doc( options, make_release=False ):
    if make_release:
        options.make_tarball = True
        options.with_dot = True
        options.with_html_help = True
        options.with_uml_look = True
        options.open = False
        options.silent = True

    version = open('version','rt').read().strip()
    output_dir = 'dist/doxygen' # relative to doc/doxyfile location.
    if not os.path.isdir( output_dir ):
        os.makedirs( output_dir )
    top_dir = os.path.abspath( '.' )
    html_output_dirname = 'jsoncpp-api-html-' + version
    tarball_path = os.path.join( 'dist', html_output_dirname + '.tar.gz' )
    warning_log_path = os.path.join( output_dir, '../jsoncpp-doxygen-warning.log' )
    html_output_path = os.path.join( output_dir, html_output_dirname )
    def yesno( bool ):
        return bool and 'YES' or 'NO'
    subst_keys = {
        '%JSONCPP_VERSION%': version,
        '%DOC_TOPDIR%': '',
        '%TOPDIR%': top_dir,
        '%HTML_OUTPUT%': os.path.join( '..', output_dir, html_output_dirname ),
        '%HAVE_DOT%': yesno(options.with_dot),
        '%DOT_PATH%': os.path.split(options.dot_path)[0],
        '%HTML_HELP%': yesno(options.with_html_help),
        '%UML_LOOK%': yesno(options.with_uml_look),
        '%WARNING_LOG_PATH%': os.path.join( '..', warning_log_path )
        }

    if os.path.isdir( output_dir ):
        print 'Deleting directory:', output_dir
        shutil.rmtree( output_dir )
    if not os.path.isdir( output_dir ):
        os.makedirs( output_dir )

    do_subst_in_file( 'doc/doxyfile', 'doc/doxyfile.in', subst_keys )
    ok = run_doxygen( options.doxygen_path, 'doc/doxyfile', 'doc', is_silent=options.silent )
    if not options.silent:
        print open(warning_log_path, 'rb').read()
    index_path = os.path.abspath(os.path.join(subst_keys['%HTML_OUTPUT%'], 'index.html'))
    print 'Generated documentation can be found in:'
    print index_path
    if options.open:
        import webbrowser
        webbrowser.open( 'file://' + index_path )
    if options.make_tarball:
        print 'Generating doc tarball to', tarball_path
        tarball_sources = [
            output_dir,
            'README.txt',
            'version'
            ]
        tarball_basedir = os.path.join( output_dir, html_output_dirname )
        tarball.make_tarball( tarball_path, tarball_sources, tarball_basedir, html_output_dirname )
    return tarball_path, html_output_dirname

def main():
    usage = """%prog
    Generates doxygen documentation in build/doxygen.
    Optionaly makes a tarball of the documentation to dist/.

    Must be started in the project top directory.
    """
    from optparse import OptionParser
    parser = OptionParser(usage=usage)
    parser.allow_interspersed_args = False
    parser.add_option('--with-dot', dest="with_dot", action='store_true', default=False,
        help="""Enable usage of DOT to generate collaboration diagram""")
    parser.add_option('--dot', dest="dot_path", action='store', default=find_program('dot'),
        help="""Path to GraphViz dot tool. Must be full qualified path. [Default: %default]""")
    parser.add_option('--doxygen', dest="doxygen_path", action='store', default=find_program('doxygen'),
        help="""Path to Doxygen tool. [Default: %default]""")
    parser.add_option('--with-html-help', dest="with_html_help", action='store_true', default=False,
        help="""Enable generation of Microsoft HTML HELP""")
    parser.add_option('--no-uml-look', dest="with_uml_look", action='store_false', default=True,
        help="""Generates DOT graph without UML look [Default: False]""")
    parser.add_option('--open', dest="open", action='store_true', default=False,
        help="""Open the HTML index in the web browser after generation""")
    parser.add_option('--tarball', dest="make_tarball", action='store_true', default=False,
        help="""Generates a tarball of the documentation in dist/ directory""")
    parser.add_option('-s', '--silent', dest="silent", action='store_true', default=False,
        help="""Hides doxygen output""")
    parser.enable_interspersed_args()
|
||||||
|
options, args = parser.parse_args()
|
||||||
|
build_doc( options )
|
||||||
|
|
||||||
|
if __name__ == '__main__':
|
||||||
|
main()
|
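The build_doc body above substitutes each `%KEY%` placeholder into doc/doxyfile via do_subst_in_file, a helper defined earlier in doxybuild.py (outside this hunk). A minimal sketch of that substitution step, on a string rather than a file — the helper name and signature here are assumptions, not the exact upstream code:

```python
def subst_in_string(template, subst_keys):
    # Replace each '%KEY%' placeholder with its configured value,
    # the way do_subst_in_file rewrites doc/doxyfile.in into doc/doxyfile.
    for key, value in subst_keys.items():
        template = template.replace(key, value)
    return template


# Example with a subset of the keys built by build_doc:
doxyfile = subst_in_string('PROJECT_NUMBER = %JSONCPP_VERSION%\nHAVE_DOT = %HAVE_DOT%',
                           {'%JSONCPP_VERSION%': '0.1', '%HAVE_DOT%': 'NO'})
```

Placeholders that have no entry in subst_keys are simply left untouched, which mirrors how an unexpanded `%KEY%` would surface in the generated doxyfile.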
19
PowerEditor/src/jsoncpp/include/json/autolink.h
Normal file
@ -0,0 +1,19 @@
#ifndef JSON_AUTOLINK_H_INCLUDED
# define JSON_AUTOLINK_H_INCLUDED

# include "config.h"

# ifdef JSON_IN_CPPTL
#  include <cpptl/cpptl_autolink.h>
# endif

# if !defined(JSON_NO_AUTOLINK) && !defined(JSON_DLL_BUILD) && !defined(JSON_IN_CPPTL)
#  define CPPTL_AUTOLINK_NAME "json"
#  undef CPPTL_AUTOLINK_DLL
#  ifdef JSON_DLL
#   define CPPTL_AUTOLINK_DLL
#  endif
#  include "autolink.h"
# endif

#endif // JSON_AUTOLINK_H_INCLUDED
43
PowerEditor/src/jsoncpp/include/json/config.h
Normal file
@ -0,0 +1,43 @@
#ifndef JSON_CONFIG_H_INCLUDED
# define JSON_CONFIG_H_INCLUDED

/// If defined, indicates that the json library is embedded in the CppTL library.
//# define JSON_IN_CPPTL 1

/// If defined, indicates that json may leverage the CppTL library
//# define JSON_USE_CPPTL 1
/// If defined, indicates that the cpptl vector based map should be used instead of std::map
/// as the Value container.
//# define JSON_USE_CPPTL_SMALLMAP 1
/// If defined, indicates that Json specific containers should be used
/// (hash table & simple deque container with customizable allocator).
/// THIS FEATURE IS STILL EXPERIMENTAL!
//# define JSON_VALUE_USE_INTERNAL_MAP 1
/// Force usage of the standard new/malloc based allocator instead of the memory pool based allocator.
/// The memory pool allocator uses an optimization (initializing Value and ValueInternalLink
/// as if they were PODs) that may cause some validation tools to report errors.
/// Only has an effect if JSON_VALUE_USE_INTERNAL_MAP is defined.
//# define JSON_USE_SIMPLE_INTERNAL_ALLOCATOR 1

/// If defined, indicates that Json uses exceptions to report invalid type manipulation
/// instead of the C assert macro.
# define JSON_USE_EXCEPTION 1

# ifdef JSON_IN_CPPTL
#  include <cpptl/config.h>
#  ifndef JSON_USE_CPPTL
#   define JSON_USE_CPPTL 1
#  endif
# endif

# ifdef JSON_IN_CPPTL
#  define JSON_API CPPTL_API
# elif defined(JSON_DLL_BUILD)
#  define JSON_API __declspec(dllexport)
# elif defined(JSON_DLL)
#  define JSON_API __declspec(dllimport)
# else
#  define JSON_API
# endif

#endif // JSON_CONFIG_H_INCLUDED
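The JSON_API branch above selects the linkage attribute from the client's point of view; a minimal sketch of how a translation unit consuming a DLL build would use it (the macro names come from config.h above, the project setup and include path are assumptions):

```cpp
// In a client that links against a jsoncpp DLL build:
// defining JSON_DLL before the first jsoncpp include makes
// JSON_API expand to __declspec(dllimport) on the declarations.
#define JSON_DLL
#include <json/json.h>
```

A static-library client (as in the lib_json.vcproj below) defines neither JSON_DLL nor JSON_DLL_BUILD, so JSON_API expands to nothing.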
42
PowerEditor/src/jsoncpp/include/json/features.h
Normal file
@ -0,0 +1,42 @@
#ifndef CPPTL_JSON_FEATURES_H_INCLUDED
# define CPPTL_JSON_FEATURES_H_INCLUDED

# include "forwards.h"

namespace Json {

   /** \brief Configuration passed to reader and writer.
    * This configuration object can be used to force the Reader or Writer
    * to behave in a standard conforming way.
    */
   class JSON_API Features
   {
   public:
      /** \brief A configuration that allows all features and assumes all strings are UTF-8.
       * - C & C++ comments are allowed
       * - Root object can be any JSON value
       * - Assumes Value strings are encoded in UTF-8
       */
      static Features all();

      /** \brief A configuration that is strictly compatible with the JSON specification.
       * - Comments are forbidden.
       * - Root object must be either an array or an object value.
       * - Assumes Value strings are encoded in UTF-8
       */
      static Features strictMode();

      /** \brief Initialize the configuration like JsonConfig::allFeatures;
       */
      Features();

      /// \c true if comments are allowed. Default: \c true.
      bool allowComments_;

      /// \c true if root must be either an array or an object value. Default: \c false.
      bool strictRoot_;
   };

} // namespace Json

#endif // CPPTL_JSON_FEATURES_H_INCLUDED
39
PowerEditor/src/jsoncpp/include/json/forwards.h
Normal file
@ -0,0 +1,39 @@
#ifndef JSON_FORWARDS_H_INCLUDED
# define JSON_FORWARDS_H_INCLUDED

# include "config.h"

namespace Json {

   // writer.h
   class FastWriter;
   class StyledWriter;

   // reader.h
   class Reader;

   // features.h
   class Features;

   // value.h
   typedef int Int;
   typedef unsigned int UInt;
   class StaticString;
   class Path;
   class PathArgument;
   class Value;
   class ValueIteratorBase;
   class ValueIterator;
   class ValueConstIterator;
#ifdef JSON_VALUE_USE_INTERNAL_MAP
   class ValueAllocator;
   class ValueMapAllocator;
   class ValueInternalLink;
   class ValueInternalArray;
   class ValueInternalMap;
#endif // #ifdef JSON_VALUE_USE_INTERNAL_MAP

} // namespace Json


#endif // JSON_FORWARDS_H_INCLUDED
10
PowerEditor/src/jsoncpp/include/json/json.h
Normal file
@ -0,0 +1,10 @@
#ifndef JSON_JSON_H_INCLUDED
# define JSON_JSON_H_INCLUDED

# include "autolink.h"
# include "value.h"
# include "reader.h"
# include "writer.h"
# include "features.h"

#endif // JSON_JSON_H_INCLUDED
196
PowerEditor/src/jsoncpp/include/json/reader.h
Normal file
@ -0,0 +1,196 @@
#ifndef CPPTL_JSON_READER_H_INCLUDED
# define CPPTL_JSON_READER_H_INCLUDED

# include "features.h"
# include "value.h"
# include <deque>
# include <stack>
# include <string>
# include <iostream>

namespace Json {

   /** \brief Unserialize a <a HREF="http://www.json.org">JSON</a> document into a Value.
    *
    */
   class JSON_API Reader
   {
   public:
      typedef char Char;
      typedef const Char *Location;

      /** \brief Constructs a Reader allowing all features
       * for parsing.
       */
      Reader();

      /** \brief Constructs a Reader allowing the specified feature set
       * for parsing.
       */
      Reader( const Features &features );

      /** \brief Read a Value from a <a HREF="http://www.json.org">JSON</a> document.
       * \param document UTF-8 encoded string containing the document to read.
       * \param root [out] Contains the root value of the document if it was
       *             successfully parsed.
       * \param collectComments \c true to collect comments and allow writing them back during
       *                        serialization, \c false to discard comments.
       *                        This parameter is ignored if Features::allowComments_
       *                        is \c false.
       * \return \c true if the document was successfully parsed, \c false if an error occurred.
       */
      bool parse( const std::string &document,
                  Value &root,
                  bool collectComments = true );

      /** \brief Read a Value from a <a HREF="http://www.json.org">JSON</a> document.
       * \param beginDoc Pointer to the beginning of the UTF-8 encoded document to read.
       * \param endDoc Pointer one past the end of the UTF-8 encoded document to read.
       * \param root [out] Contains the root value of the document if it was
       *             successfully parsed.
       * \param collectComments \c true to collect comments and allow writing them back during
       *                        serialization, \c false to discard comments.
       *                        This parameter is ignored if Features::allowComments_
       *                        is \c false.
       * \return \c true if the document was successfully parsed, \c false if an error occurred.
       */
      bool parse( const char *beginDoc, const char *endDoc,
                  Value &root,
                  bool collectComments = true );

      /// \brief Parse from input stream.
      /// \see Json::operator>>(std::istream&, Json::Value&).
      bool parse( std::istream &is,
                  Value &root,
                  bool collectComments = true );

      /** \brief Returns a user friendly string that lists the errors in the parsed document.
       * \return Formatted error message with the list of errors and their location in
       *         the parsed document. An empty string is returned if no error occurred
       *         during parsing.
       */
      std::string getFormatedErrorMessages() const;

   private:
      enum TokenType
      {
         tokenEndOfStream = 0,
         tokenObjectBegin,
         tokenObjectEnd,
         tokenArrayBegin,
         tokenArrayEnd,
         tokenString,
         tokenNumber,
         tokenTrue,
         tokenFalse,
         tokenNull,
         tokenArraySeparator,
         tokenMemberSeparator,
         tokenComment,
         tokenError
      };

      class Token
      {
      public:
         TokenType type_;
         Location start_;
         Location end_;
      };

      class ErrorInfo
      {
      public:
         Token token_;
         std::string message_;
         Location extra_;
      };

      typedef std::deque<ErrorInfo> Errors;

      bool expectToken( TokenType type, Token &token, const char *message );
      bool readToken( Token &token );
      void skipSpaces();
      bool match( Location pattern,
                  int patternLength );
      bool readComment();
      bool readCStyleComment();
      bool readCppStyleComment();
      bool readString();
      void readNumber();
      bool readValue();
      bool readObject( Token &token );
      bool readArray( Token &token );
      bool decodeNumber( Token &token );
      bool decodeString( Token &token );
      bool decodeString( Token &token, std::string &decoded );
      bool decodeDouble( Token &token );
      bool decodeUnicodeCodePoint( Token &token,
                                   Location &current,
                                   Location end,
                                   unsigned int &unicode );
      bool decodeUnicodeEscapeSequence( Token &token,
                                        Location &current,
                                        Location end,
                                        unsigned int &unicode );
      bool addError( const std::string &message,
                     Token &token,
                     Location extra = 0 );
      bool recoverFromError( TokenType skipUntilToken );
      bool addErrorAndRecover( const std::string &message,
                               Token &token,
                               TokenType skipUntilToken );
      void skipUntilSpace();
      Value &currentValue();
      Char getNextChar();
      void getLocationLineAndColumn( Location location,
                                     int &line,
                                     int &column ) const;
      std::string getLocationLineAndColumn( Location location ) const;
      void addComment( Location begin,
                       Location end,
                       CommentPlacement placement );
      void skipCommentTokens( Token &token );

      typedef std::stack<Value *> Nodes;
      Nodes nodes_;
      Errors errors_;
      std::string document_;
      Location begin_;
      Location end_;
      Location current_;
      Location lastValueEnd_;
      Value *lastValue_;
      std::string commentsBefore_;
      Features features_;
      bool collectComments_;
   };

   /** \brief Read from 'sin' into 'root'.

    Always keep comments from the input JSON.

    This can be used to read a file into a particular sub-object.
    For example:
    \code
    Json::Value root;
    cin >> root["dir"]["file"];
    cout << root;
    \endcode
    Result:
    \verbatim
    {
       "dir": {
          "file": {
             // The input stream JSON would be nested here.
          }
       }
    }
    \endverbatim
    \throw std::exception on parse error.
    \see Json::operator<<()
    */
   std::istream& operator>>( std::istream&, Value& );

} // namespace Json

#endif // CPPTL_JSON_READER_H_INCLUDED
1069
PowerEditor/src/jsoncpp/include/json/value.h
Normal file
File diff suppressed because it is too large
174
PowerEditor/src/jsoncpp/include/json/writer.h
Normal file
@ -0,0 +1,174 @@
#ifndef JSON_WRITER_H_INCLUDED
# define JSON_WRITER_H_INCLUDED

# include "value.h"
# include <vector>
# include <string>
# include <iostream>

namespace Json {

   class Value;

   /** \brief Abstract class for writers.
    */
   class JSON_API Writer
   {
   public:
      virtual ~Writer();

      virtual std::string write( const Value &root ) = 0;
   };

   /** \brief Outputs a Value in <a HREF="http://www.json.org">JSON</a> format without formatting (not human friendly).
    *
    * The JSON document is written on a single line. It is not intended for 'human' consumption,
    * but may be useful to support features such as RPC where bandwidth is limited.
    * \sa Reader, Value
    */
   class JSON_API FastWriter : public Writer
   {
   public:
      FastWriter();
      virtual ~FastWriter(){}

      void enableYAMLCompatibility();

   public: // overridden from Writer
      virtual std::string write( const Value &root );

   private:
      void writeValue( const Value &value );

      std::string document_;
      bool yamlCompatiblityEnabled_;
   };

   /** \brief Writes a Value in <a HREF="http://www.json.org">JSON</a> format in a human friendly way.
    *
    * The rules for line breaks and indentation are as follows:
    * - Object value:
    *     - if empty then print {} without indent and line break
    *     - if not empty then print '{', line break & indent, print one value per line,
    *       and then unindent and line break and print '}'.
    * - Array value:
    *     - if empty then print [] without indent and line break
    *     - if the array contains no object value, empty array or other value types,
    *       and all the values fit on one line, then print the array on a single line.
    *     - otherwise, if the values do not fit on one line, or the array contains
    *       an object or a non-empty array, then print one value per line.
    *
    * If the Value has comments then they are output according to their #CommentPlacement.
    *
    * \sa Reader, Value, Value::setComment()
    */
   class JSON_API StyledWriter: public Writer
   {
   public:
      StyledWriter();
      virtual ~StyledWriter(){}

   public: // overridden from Writer
      /** \brief Serialize a Value in <a HREF="http://www.json.org">JSON</a> format.
       * \param root Value to serialize.
       * \return String containing the JSON document that represents the root value.
       */
      virtual std::string write( const Value &root );

   private:
      void writeValue( const Value &value );
      void writeArrayValue( const Value &value );
      bool isMultineArray( const Value &value );
      void pushValue( const std::string &value );
      void writeIndent();
      void writeWithIndent( const std::string &value );
      void indent();
      void unindent();
      void writeCommentBeforeValue( const Value &root );
      void writeCommentAfterValueOnSameLine( const Value &root );
      bool hasCommentForValue( const Value &value );
      static std::string normalizeEOL( const std::string &text );

      typedef std::vector<std::string> ChildValues;

      ChildValues childValues_;
      std::string document_;
      std::string indentString_;
      int rightMargin_;
      int indentSize_;
      bool addChildValues_;
   };

   /** \brief Writes a Value in <a HREF="http://www.json.org">JSON</a> format in a human friendly way,
        to a stream rather than to a string.
    *
    * The rules for line breaks and indentation are as follows:
    * - Object value:
    *     - if empty then print {} without indent and line break
    *     - if not empty then print '{', line break & indent, print one value per line,
    *       and then unindent and line break and print '}'.
    * - Array value:
    *     - if empty then print [] without indent and line break
    *     - if the array contains no object value, empty array or other value types,
    *       and all the values fit on one line, then print the array on a single line.
    *     - otherwise, if the values do not fit on one line, or the array contains
    *       an object or a non-empty array, then print one value per line.
    *
    * If the Value has comments then they are output according to their #CommentPlacement.
    *
    * \param indentation Each level will be indented by this amount extra.
    * \sa Reader, Value, Value::setComment()
    */
   class JSON_API StyledStreamWriter
   {
   public:
      StyledStreamWriter( std::string indentation="\t" );
      ~StyledStreamWriter(){}

   public:
      /** \brief Serialize a Value in <a HREF="http://www.json.org">JSON</a> format.
       * \param out Stream to write to. (Can be ostringstream, e.g.)
       * \param root Value to serialize.
       * \note There is no point in deriving from Writer, since write() should not return a value.
       */
      void write( std::ostream &out, const Value &root );

   private:
      void writeValue( const Value &value );
      void writeArrayValue( const Value &value );
      bool isMultineArray( const Value &value );
      void pushValue( const std::string &value );
      void writeIndent();
      void writeWithIndent( const std::string &value );
      void indent();
      void unindent();
      void writeCommentBeforeValue( const Value &root );
      void writeCommentAfterValueOnSameLine( const Value &root );
      bool hasCommentForValue( const Value &value );
      static std::string normalizeEOL( const std::string &text );

      typedef std::vector<std::string> ChildValues;

      ChildValues childValues_;
      std::ostream* document_;
      std::string indentString_;
      int rightMargin_;
      std::string indentation_;
      bool addChildValues_;
   };

   std::string JSON_API valueToString( Int value );
   std::string JSON_API valueToString( UInt value );
   std::string JSON_API valueToString( double value );
   std::string JSON_API valueToString( bool value );
   std::string JSON_API valueToQuotedString( const char *value );

   /// \brief Output using the StyledStreamWriter.
   /// \see Json::operator>>()
   std::ostream& operator<<( std::ostream&, const Value &root );

} // namespace Json



#endif // JSON_WRITER_H_INCLUDED
44
PowerEditor/src/jsoncpp/makefiles/vs71/jsoncpp.sln
Normal file
@ -0,0 +1,44 @@
Microsoft Visual Studio Solution File, Format Version 9.00
# Visual Studio 2005
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "lib_json", "lib_json.vcproj", "{B84F7231-16CE-41D8-8C08-7B523FF4225B}"
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "jsontest", "jsontest.vcproj", "{25AF2DD2-D396-4668-B188-488C33B8E620}"
	ProjectSection(ProjectDependencies) = postProject
		{B84F7231-16CE-41D8-8C08-7B523FF4225B} = {B84F7231-16CE-41D8-8C08-7B523FF4225B}
	EndProjectSection
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "test_lib_json", "test_lib_json.vcproj", "{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}"
	ProjectSection(ProjectDependencies) = postProject
		{B84F7231-16CE-41D8-8C08-7B523FF4225B} = {B84F7231-16CE-41D8-8C08-7B523FF4225B}
	EndProjectSection
EndProject
Global
	GlobalSection(SolutionConfigurationPlatforms) = preSolution
		Debug|Win32 = Debug|Win32
		dummy|Win32 = dummy|Win32
		Release|Win32 = Release|Win32
	EndGlobalSection
	GlobalSection(ProjectConfigurationPlatforms) = postSolution
		{B84F7231-16CE-41D8-8C08-7B523FF4225B}.Debug|Win32.ActiveCfg = Debug|Win32
		{B84F7231-16CE-41D8-8C08-7B523FF4225B}.Debug|Win32.Build.0 = Debug|Win32
		{B84F7231-16CE-41D8-8C08-7B523FF4225B}.dummy|Win32.ActiveCfg = dummy|Win32
		{B84F7231-16CE-41D8-8C08-7B523FF4225B}.dummy|Win32.Build.0 = dummy|Win32
		{B84F7231-16CE-41D8-8C08-7B523FF4225B}.Release|Win32.ActiveCfg = Release|Win32
		{B84F7231-16CE-41D8-8C08-7B523FF4225B}.Release|Win32.Build.0 = Release|Win32
		{25AF2DD2-D396-4668-B188-488C33B8E620}.Debug|Win32.ActiveCfg = Debug|Win32
		{25AF2DD2-D396-4668-B188-488C33B8E620}.Debug|Win32.Build.0 = Debug|Win32
		{25AF2DD2-D396-4668-B188-488C33B8E620}.dummy|Win32.ActiveCfg = Debug|Win32
		{25AF2DD2-D396-4668-B188-488C33B8E620}.dummy|Win32.Build.0 = Debug|Win32
		{25AF2DD2-D396-4668-B188-488C33B8E620}.Release|Win32.ActiveCfg = Release|Win32
		{25AF2DD2-D396-4668-B188-488C33B8E620}.Release|Win32.Build.0 = Release|Win32
		{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Debug|Win32.ActiveCfg = Debug|Win32
		{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Debug|Win32.Build.0 = Debug|Win32
		{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.dummy|Win32.ActiveCfg = Debug|Win32
		{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.dummy|Win32.Build.0 = Debug|Win32
		{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Release|Win32.ActiveCfg = Release|Win32
		{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}.Release|Win32.Build.0 = Release|Win32
	EndGlobalSection
	GlobalSection(SolutionProperties) = preSolution
		HideSolutionNode = FALSE
	EndGlobalSection
EndGlobal
184
PowerEditor/src/jsoncpp/makefiles/vs71/jsontest.vcproj
Normal file
@ -0,0 +1,184 @@
<?xml version="1.0" encoding="Windows-1252"?>
<VisualStudioProject
	ProjectType="Visual C++"
	Version="8,00"
	Name="jsontest"
	ProjectGUID="{25AF2DD2-D396-4668-B188-488C33B8E620}"
	Keyword="Win32Proj"
	>
	<Platforms>
		<Platform
			Name="Win32"
		/>
	</Platforms>
	<ToolFiles>
	</ToolFiles>
	<Configurations>
		<Configuration
			Name="Debug|Win32"
			OutputDirectory="../../build/vs71/debug/jsontest"
			IntermediateDirectory="../../build/vs71/debug/jsontest"
			ConfigurationType="1"
			InheritedPropertySheets="$(VCInstallDir)VCProjectDefaults\UpgradeFromVC71.vsprops"
			CharacterSet="2"
			>
			<Tool
				Name="VCPreBuildEventTool"
			/>
			<Tool
				Name="VCCustomBuildTool"
			/>
			<Tool
				Name="VCXMLDataGeneratorTool"
			/>
			<Tool
				Name="VCWebServiceProxyGeneratorTool"
			/>
			<Tool
				Name="VCMIDLTool"
			/>
			<Tool
				Name="VCCLCompilerTool"
				Optimization="0"
				AdditionalIncludeDirectories="../../include"
				PreprocessorDefinitions="WIN32;_DEBUG;_CONSOLE"
				MinimalRebuild="true"
				BasicRuntimeChecks="3"
				RuntimeLibrary="1"
				UsePrecompiledHeader="0"
				WarningLevel="3"
				Detect64BitPortabilityProblems="true"
				DebugInformationFormat="4"
			/>
			<Tool
				Name="VCManagedResourceCompilerTool"
			/>
			<Tool
				Name="VCResourceCompilerTool"
			/>
			<Tool
				Name="VCPreLinkEventTool"
			/>
			<Tool
				Name="VCLinkerTool"
				OutputFile="$(OutDir)/jsontest.exe"
				LinkIncremental="2"
				GenerateDebugInformation="true"
				ProgramDatabaseFile="$(OutDir)/jsontest.pdb"
				SubSystem="1"
				TargetMachine="1"
			/>
			<Tool
				Name="VCALinkTool"
			/>
			<Tool
				Name="VCManifestTool"
			/>
			<Tool
				Name="VCXDCMakeTool"
			/>
			<Tool
				Name="VCBscMakeTool"
			/>
			<Tool
				Name="VCFxCopTool"
			/>
			<Tool
				Name="VCAppVerifierTool"
			/>
			<Tool
				Name="VCWebDeploymentTool"
			/>
			<Tool
				Name="VCPostBuildEventTool"
			/>
		</Configuration>
		<Configuration
			Name="Release|Win32"
			OutputDirectory="../../build/vs71/release/jsontest"
			IntermediateDirectory="../../build/vs71/release/jsontest"
			ConfigurationType="1"
			InheritedPropertySheets="$(VCInstallDir)VCProjectDefaults\UpgradeFromVC71.vsprops"
			CharacterSet="2"
			>
			<Tool
				Name="VCPreBuildEventTool"
			/>
			<Tool
				Name="VCCustomBuildTool"
			/>
			<Tool
				Name="VCXMLDataGeneratorTool"
			/>
			<Tool
				Name="VCWebServiceProxyGeneratorTool"
			/>
			<Tool
				Name="VCMIDLTool"
			/>
			<Tool
				Name="VCCLCompilerTool"
				AdditionalIncludeDirectories="../../include"
				PreprocessorDefinitions="WIN32;NDEBUG;_CONSOLE"
				RuntimeLibrary="0"
				UsePrecompiledHeader="0"
				WarningLevel="3"
				Detect64BitPortabilityProblems="true"
				DebugInformationFormat="3"
			/>
			<Tool
				Name="VCManagedResourceCompilerTool"
			/>
			<Tool
				Name="VCResourceCompilerTool"
			/>
			<Tool
				Name="VCPreLinkEventTool"
			/>
			<Tool
				Name="VCLinkerTool"
				OutputFile="$(OutDir)/jsontest.exe"
				LinkIncremental="1"
				GenerateDebugInformation="true"
				SubSystem="1"
				OptimizeReferences="2"
				EnableCOMDATFolding="2"
				TargetMachine="1"
			/>
			<Tool
				Name="VCALinkTool"
			/>
			<Tool
				Name="VCManifestTool"
			/>
			<Tool
				Name="VCXDCMakeTool"
			/>
			<Tool
				Name="VCBscMakeTool"
			/>
			<Tool
				Name="VCFxCopTool"
			/>
			<Tool
				Name="VCAppVerifierTool"
			/>
			<Tool
				Name="VCWebDeploymentTool"
			/>
			<Tool
				Name="VCPostBuildEventTool"
			/>
		</Configuration>
	</Configurations>
	<References>
	</References>
	<Files>
		<File
			RelativePath="..\..\src\jsontestrunner\main.cpp"
			>
		</File>
	</Files>
	<Globals>
	</Globals>
</VisualStudioProject>
308
PowerEditor/src/jsoncpp/makefiles/vs71/lib_json.vcproj
Normal file
308
PowerEditor/src/jsoncpp/makefiles/vs71/lib_json.vcproj
Normal file
@ -0,0 +1,308 @@
<?xml version="1.0" encoding="Windows-1252"?>
<VisualStudioProject
    ProjectType="Visual C++"
    Version="8,00"
    Name="lib_json"
    ProjectGUID="{B84F7231-16CE-41D8-8C08-7B523FF4225B}"
    RootNamespace="lib_json"
    Keyword="Win32Proj"
    >
    <Platforms>
        <Platform
            Name="Win32"
        />
    </Platforms>
    <ToolFiles>
    </ToolFiles>
    <Configurations>
        <Configuration
            Name="Debug|Win32"
            OutputDirectory="../../build/vs71/debug/lib_json"
            IntermediateDirectory="../../build/vs71/debug/lib_json"
            ConfigurationType="4"
            InheritedPropertySheets="$(VCInstallDir)VCProjectDefaults\UpgradeFromVC71.vsprops"
            CharacterSet="2"
            >
            <Tool
                Name="VCPreBuildEventTool"
            />
            <Tool
                Name="VCCustomBuildTool"
            />
            <Tool
                Name="VCXMLDataGeneratorTool"
            />
            <Tool
                Name="VCWebServiceProxyGeneratorTool"
            />
            <Tool
                Name="VCMIDLTool"
            />
            <Tool
                Name="VCCLCompilerTool"
                Optimization="0"
                AdditionalIncludeDirectories="../../include"
                PreprocessorDefinitions="WIN32;_DEBUG;_LIB"
                StringPooling="true"
                MinimalRebuild="true"
                BasicRuntimeChecks="3"
                RuntimeLibrary="1"
                EnableFunctionLevelLinking="true"
                DisableLanguageExtensions="true"
                ForceConformanceInForLoopScope="false"
                RuntimeTypeInfo="true"
                UsePrecompiledHeader="0"
                WarningLevel="3"
                Detect64BitPortabilityProblems="true"
                DebugInformationFormat="4"
            />
            <Tool
                Name="VCManagedResourceCompilerTool"
            />
            <Tool
                Name="VCResourceCompilerTool"
            />
            <Tool
                Name="VCPreLinkEventTool"
            />
            <Tool
                Name="VCLibrarianTool"
                OutputFile="$(OutDir)/json_vc71_libmtd.lib"
            />
            <Tool
                Name="VCALinkTool"
            />
            <Tool
                Name="VCXDCMakeTool"
            />
            <Tool
                Name="VCBscMakeTool"
            />
            <Tool
                Name="VCFxCopTool"
            />
            <Tool
                Name="VCPostBuildEventTool"
            />
        </Configuration>
        <Configuration
            Name="Release|Win32"
            OutputDirectory="../../build/vs71/release/lib_json"
            IntermediateDirectory="../../build/vs71/release/lib_json"
            ConfigurationType="4"
            InheritedPropertySheets="$(VCInstallDir)VCProjectDefaults\UpgradeFromVC71.vsprops"
            CharacterSet="2"
            WholeProgramOptimization="1"
            >
            <Tool
                Name="VCPreBuildEventTool"
            />
            <Tool
                Name="VCCustomBuildTool"
            />
            <Tool
                Name="VCXMLDataGeneratorTool"
            />
            <Tool
                Name="VCWebServiceProxyGeneratorTool"
            />
            <Tool
                Name="VCMIDLTool"
            />
            <Tool
                Name="VCCLCompilerTool"
                EnableIntrinsicFunctions="true"
                AdditionalIncludeDirectories="../../include"
                PreprocessorDefinitions="WIN32;NDEBUG;_LIB"
                StringPooling="true"
                RuntimeLibrary="0"
                EnableFunctionLevelLinking="true"
                DisableLanguageExtensions="true"
                ForceConformanceInForLoopScope="false"
                RuntimeTypeInfo="true"
                UsePrecompiledHeader="0"
                AssemblerOutput="0"
                WarningLevel="3"
                Detect64BitPortabilityProblems="true"
                DebugInformationFormat="3"
            />
            <Tool
                Name="VCManagedResourceCompilerTool"
            />
            <Tool
                Name="VCResourceCompilerTool"
            />
            <Tool
                Name="VCPreLinkEventTool"
            />
            <Tool
                Name="VCLibrarianTool"
                OutputFile="$(OutDir)/json_vc71_libmt.lib"
            />
            <Tool
                Name="VCALinkTool"
            />
            <Tool
                Name="VCXDCMakeTool"
            />
            <Tool
                Name="VCBscMakeTool"
            />
            <Tool
                Name="VCFxCopTool"
            />
            <Tool
                Name="VCPostBuildEventTool"
            />
        </Configuration>
        <Configuration
            Name="dummy|Win32"
            OutputDirectory="$(ConfigurationName)"
            IntermediateDirectory="$(ConfigurationName)"
            ConfigurationType="2"
            InheritedPropertySheets="$(VCInstallDir)VCProjectDefaults\UpgradeFromVC71.vsprops"
            CharacterSet="2"
            WholeProgramOptimization="1"
            >
            <Tool
                Name="VCPreBuildEventTool"
            />
            <Tool
                Name="VCCustomBuildTool"
            />
            <Tool
                Name="VCXMLDataGeneratorTool"
            />
            <Tool
                Name="VCWebServiceProxyGeneratorTool"
            />
            <Tool
                Name="VCMIDLTool"
            />
            <Tool
                Name="VCCLCompilerTool"
                EnableIntrinsicFunctions="true"
                AdditionalIncludeDirectories="../../include"
                PreprocessorDefinitions="WIN32;NDEBUG;_LIB"
                StringPooling="true"
                RuntimeLibrary="0"
                EnableFunctionLevelLinking="true"
                DisableLanguageExtensions="true"
                ForceConformanceInForLoopScope="false"
                RuntimeTypeInfo="true"
                UsePrecompiledHeader="0"
                AssemblerOutput="4"
                WarningLevel="3"
                Detect64BitPortabilityProblems="true"
                DebugInformationFormat="3"
            />
            <Tool
                Name="VCManagedResourceCompilerTool"
            />
            <Tool
                Name="VCResourceCompilerTool"
            />
            <Tool
                Name="VCPreLinkEventTool"
            />
            <Tool
                Name="VCLinkerTool"
                GenerateDebugInformation="true"
                SubSystem="2"
                OptimizeReferences="2"
                EnableCOMDATFolding="2"
                TargetMachine="1"
            />
            <Tool
                Name="VCALinkTool"
            />
            <Tool
                Name="VCManifestTool"
            />
            <Tool
                Name="VCXDCMakeTool"
            />
            <Tool
                Name="VCBscMakeTool"
            />
            <Tool
                Name="VCFxCopTool"
            />
            <Tool
                Name="VCAppVerifierTool"
            />
            <Tool
                Name="VCWebDeploymentTool"
            />
            <Tool
                Name="VCPostBuildEventTool"
            />
        </Configuration>
    </Configurations>
    <References>
    </References>
    <Files>
        <File
            RelativePath="..\..\include\json\autolink.h"
            >
        </File>
        <File
            RelativePath="..\..\include\json\config.h"
            >
        </File>
        <File
            RelativePath="..\..\include\json\features.h"
            >
        </File>
        <File
            RelativePath="..\..\include\json\forwards.h"
            >
        </File>
        <File
            RelativePath="..\..\include\json\json.h"
            >
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_batchallocator.h"
            >
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_internalarray.inl"
            >
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_internalmap.inl"
            >
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_reader.cpp"
            >
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_value.cpp"
            >
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_valueiterator.inl"
            >
        </File>
        <File
            RelativePath="..\..\src\lib_json\json_writer.cpp"
            >
        </File>
        <File
            RelativePath="..\..\include\json\reader.h"
            >
        </File>
        <File
            RelativePath="..\..\include\json\value.h"
            >
        </File>
        <File
            RelativePath="..\..\include\json\writer.h"
            >
        </File>
    </Files>
    <Globals>
    </Globals>
</VisualStudioProject>
197
PowerEditor/src/jsoncpp/makefiles/vs71/test_lib_json.vcproj
Normal file
@ -0,0 +1,197 @@
<?xml version="1.0" encoding="Windows-1252"?>
<VisualStudioProject
    ProjectType="Visual C++"
    Version="8,00"
    Name="test_lib_json"
    ProjectGUID="{B7A96B78-2782-40D2-8F37-A2DEF2B9C26D}"
    RootNamespace="test_lib_json"
    Keyword="Win32Proj"
    >
    <Platforms>
        <Platform
            Name="Win32"
        />
    </Platforms>
    <ToolFiles>
    </ToolFiles>
    <Configurations>
        <Configuration
            Name="Debug|Win32"
            OutputDirectory="../../build/vs71/debug/test_lib_json"
            IntermediateDirectory="../../build/vs71/debug/test_lib_json"
            ConfigurationType="1"
            InheritedPropertySheets="$(VCInstallDir)VCProjectDefaults\UpgradeFromVC71.vsprops"
            CharacterSet="2"
            >
            <Tool
                Name="VCPreBuildEventTool"
            />
            <Tool
                Name="VCCustomBuildTool"
            />
            <Tool
                Name="VCXMLDataGeneratorTool"
            />
            <Tool
                Name="VCWebServiceProxyGeneratorTool"
            />
            <Tool
                Name="VCMIDLTool"
            />
            <Tool
                Name="VCCLCompilerTool"
                Optimization="0"
                AdditionalIncludeDirectories="../../include"
                PreprocessorDefinitions="WIN32;_DEBUG;_CONSOLE"
                MinimalRebuild="true"
                BasicRuntimeChecks="3"
                RuntimeLibrary="1"
                UsePrecompiledHeader="0"
                WarningLevel="3"
                Detect64BitPortabilityProblems="true"
                DebugInformationFormat="4"
            />
            <Tool
                Name="VCManagedResourceCompilerTool"
            />
            <Tool
                Name="VCResourceCompilerTool"
            />
            <Tool
                Name="VCPreLinkEventTool"
            />
            <Tool
                Name="VCLinkerTool"
                OutputFile="$(OutDir)/test_lib_json.exe"
                LinkIncremental="2"
                GenerateDebugInformation="true"
                ProgramDatabaseFile="$(OutDir)/test_lib_json.pdb"
                SubSystem="1"
                TargetMachine="1"
            />
            <Tool
                Name="VCALinkTool"
            />
            <Tool
                Name="VCManifestTool"
            />
            <Tool
                Name="VCXDCMakeTool"
            />
            <Tool
                Name="VCBscMakeTool"
            />
            <Tool
                Name="VCFxCopTool"
            />
            <Tool
                Name="VCAppVerifierTool"
            />
            <Tool
                Name="VCWebDeploymentTool"
            />
            <Tool
                Name="VCPostBuildEventTool"
                Description="Running all unit tests"
                CommandLine="$(TargetPath)"
            />
        </Configuration>
        <Configuration
            Name="Release|Win32"
            OutputDirectory="../../build/vs71/release/test_lib_json"
            IntermediateDirectory="../../build/vs71/release/test_lib_json"
            ConfigurationType="1"
            InheritedPropertySheets="$(VCInstallDir)VCProjectDefaults\UpgradeFromVC71.vsprops"
            CharacterSet="2"
            >
            <Tool
                Name="VCPreBuildEventTool"
            />
            <Tool
                Name="VCCustomBuildTool"
            />
            <Tool
                Name="VCXMLDataGeneratorTool"
            />
            <Tool
                Name="VCWebServiceProxyGeneratorTool"
            />
            <Tool
                Name="VCMIDLTool"
            />
            <Tool
                Name="VCCLCompilerTool"
                AdditionalIncludeDirectories="../../include"
                PreprocessorDefinitions="WIN32;NDEBUG;_CONSOLE"
                RuntimeLibrary="0"
                UsePrecompiledHeader="0"
                WarningLevel="3"
                Detect64BitPortabilityProblems="true"
                DebugInformationFormat="3"
            />
            <Tool
                Name="VCManagedResourceCompilerTool"
            />
            <Tool
                Name="VCResourceCompilerTool"
            />
            <Tool
                Name="VCPreLinkEventTool"
            />
            <Tool
                Name="VCLinkerTool"
                OutputFile="$(OutDir)/test_lib_json.exe"
                LinkIncremental="1"
                GenerateDebugInformation="true"
                SubSystem="1"
                OptimizeReferences="2"
                EnableCOMDATFolding="2"
                TargetMachine="1"
            />
            <Tool
                Name="VCALinkTool"
            />
            <Tool
                Name="VCManifestTool"
            />
            <Tool
                Name="VCXDCMakeTool"
            />
            <Tool
                Name="VCBscMakeTool"
            />
            <Tool
                Name="VCFxCopTool"
            />
            <Tool
                Name="VCAppVerifierTool"
            />
            <Tool
                Name="VCWebDeploymentTool"
            />
            <Tool
                Name="VCPostBuildEventTool"
                Description="Running all unit tests"
                CommandLine="$(TargetPath)"
            />
        </Configuration>
    </Configurations>
    <References>
    </References>
    <Files>
        <File
            RelativePath="..\..\src\test_lib_json\jsontest.cpp"
            >
        </File>
        <File
            RelativePath="..\..\src\test_lib_json\jsontest.h"
            >
        </File>
        <File
            RelativePath="..\..\src\test_lib_json\main.cpp"
            >
        </File>
    </Files>
    <Globals>
    </Globals>
</VisualStudioProject>
368
PowerEditor/src/jsoncpp/makerelease.py
Normal file
@ -0,0 +1,368 @@
"""Tag the sandbox for release, make source and doc tarballs.

Requires Python 2.6

Example of invocation (use to test the script):
python makerelease.py --force --retag --platform=msvc6,msvc71,msvc80,mingw -ublep 0.5.0 0.6.0-dev

Example of invocation when doing a release:
python makerelease.py 0.5.0 0.6.0-dev
"""
import os.path
import subprocess
import sys
import doxybuild
import subprocess
import xml.etree.ElementTree as ElementTree
import shutil
import urllib2
import tempfile
import os
import time
from devtools import antglob, fixeol, tarball

SVN_ROOT = 'https://jsoncpp.svn.sourceforge.net/svnroot/jsoncpp/'
SVN_TAG_ROOT = SVN_ROOT + 'tags/jsoncpp'
SCONS_LOCAL_URL = 'http://sourceforge.net/projects/scons/files/scons-local/1.2.0/scons-local-1.2.0.tar.gz/download'
SOURCEFORGE_PROJECT = 'jsoncpp'

def set_version( version ):
    with open('version','wb') as f:
        f.write( version.strip() )

def rmdir_if_exist( dir_path ):
    if os.path.isdir( dir_path ):
        shutil.rmtree( dir_path )

class SVNError(Exception):
    pass

def svn_command( command, *args ):
    cmd = ['svn', '--non-interactive', command] + list(args)
    print 'Running:', ' '.join( cmd )
    process = subprocess.Popen( cmd,
                                stdout=subprocess.PIPE,
                                stderr=subprocess.STDOUT )
    stdout = process.communicate()[0]
    if process.returncode:
        error = SVNError( 'SVN command failed:\n' + stdout )
        error.returncode = process.returncode
        raise error
    return stdout

def check_no_pending_commit():
    """Checks that there is no pending commit in the sandbox."""
    stdout = svn_command( 'status', '--xml' )
    etree = ElementTree.fromstring( stdout )
    msg = []
    for entry in etree.getiterator( 'entry' ):
        path = entry.get('path')
        status = entry.find('wc-status').get('item')
        if status != 'unversioned' and path != 'version':
            msg.append( 'File "%s" has pending change (status="%s")' % (path, status) )
    if msg:
        msg.insert(0, 'Pending change to commit found in sandbox. Commit them first!' )
    return '\n'.join( msg )

def svn_join_url( base_url, suffix ):
    if not base_url.endswith('/'):
        base_url += '/'
    if suffix.startswith('/'):
        suffix = suffix[1:]
    return base_url + suffix

def svn_check_if_tag_exist( tag_url ):
    """Checks if a tag exist.
    Returns: True if the tag exist, False otherwise.
    """
    try:
        list_stdout = svn_command( 'list', tag_url )
    except SVNError, e:
        if e.returncode != 1 or not str(e).find('tag_url'):
            raise e
        # otherwise ignore error, meaning tag does not exist
        return False
    return True

def svn_commit( message ):
    """Commit the sandbox, providing the specified comment.
    """
    svn_command( 'ci', '-m', message )

def svn_tag_sandbox( tag_url, message ):
    """Makes a tag based on the sandbox revisions.
    """
    svn_command( 'copy', '-m', message, '.', tag_url )

def svn_remove_tag( tag_url, message ):
    """Removes an existing tag.
    """
    svn_command( 'delete', '-m', message, tag_url )

def svn_export( tag_url, export_dir ):
    """Exports the tag_url revision to export_dir.
    Target directory, including its parent is created if it does not exist.
    If the directory export_dir exist, it is deleted before export proceed.
    """
    rmdir_if_exist( export_dir )
    svn_command( 'export', tag_url, export_dir )

def fix_sources_eol( dist_dir ):
    """Set file EOL for tarball distribution.
    """
    print 'Preparing exported source file EOL for distribution...'
    prune_dirs = antglob.prune_dirs + 'scons-local* ./build* ./libs ./dist'
    win_sources = antglob.glob( dist_dir,
        includes = '**/*.sln **/*.vcproj',
        prune_dirs = prune_dirs )
    unix_sources = antglob.glob( dist_dir,
        includes = '''**/*.h **/*.cpp **/*.inl **/*.txt **/*.dox **/*.py **/*.html **/*.in
        sconscript *.json *.expected AUTHORS LICENSE''',
        excludes = antglob.default_excludes + 'scons.py sconsign.py scons-*',
        prune_dirs = prune_dirs )
    for path in win_sources:
        fixeol.fix_source_eol( path, is_dry_run = False, verbose = True, eol = '\r\n' )
    for path in unix_sources:
        fixeol.fix_source_eol( path, is_dry_run = False, verbose = True, eol = '\n' )

def download( url, target_path ):
    """Download file represented by url to target_path.
    """
    f = urllib2.urlopen( url )
    try:
        data = f.read()
    finally:
        f.close()
    fout = open( target_path, 'wb' )
    try:
        fout.write( data )
    finally:
        fout.close()

def check_compile( distcheck_top_dir, platform ):
    cmd = [sys.executable, 'scons.py', 'platform=%s' % platform, 'check']
    print 'Running:', ' '.join( cmd )
    log_path = os.path.join( distcheck_top_dir, 'build-%s.log' % platform )
    flog = open( log_path, 'wb' )
    try:
        process = subprocess.Popen( cmd,
                                    stdout=flog,
                                    stderr=subprocess.STDOUT,
                                    cwd=distcheck_top_dir )
        stdout = process.communicate()[0]
        status = (process.returncode == 0)
    finally:
        flog.close()
    return (status, log_path)

def write_tempfile( content, **kwargs ):
    fd, path = tempfile.mkstemp( **kwargs )
    f = os.fdopen( fd, 'wt' )
    try:
        f.write( content )
    finally:
        f.close()
    return path

class SFTPError(Exception):
    pass

def run_sftp_batch( userhost, sftp, batch, retry=0 ):
    path = write_tempfile( batch, suffix='.sftp', text=True )
    # psftp -agent -C blep,jsoncpp@web.sourceforge.net -batch -b batch.sftp -bc
    cmd = [sftp, '-agent', '-C', '-batch', '-b', path, '-bc', userhost]
    error = None
    for retry_index in xrange(0, max(1,retry)):
        heading = retry_index == 0 and 'Running:' or 'Retrying:'
        print heading, ' '.join( cmd )
        process = subprocess.Popen( cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT )
        stdout = process.communicate()[0]
        if process.returncode != 0:
            error = SFTPError( 'SFTP batch failed:\n' + stdout )
        else:
            break
    if error:
        raise error
    return stdout

def sourceforge_web_synchro( sourceforge_project, doc_dir,
                             user=None, sftp='sftp' ):
    """Notes: does not synchronize sub-directory of doc-dir.
    """
    userhost = '%s,%s@web.sourceforge.net' % (user, sourceforge_project)
    stdout = run_sftp_batch( userhost, sftp, """
cd htdocs
dir
exit
""" )
    existing_paths = set()
    collect = 0
    for line in stdout.split('\n'):
        line = line.strip()
        if not collect and line.endswith('> dir'):
            collect = True
        elif collect and line.endswith('> exit'):
            break
        elif collect == 1:
            collect = 2
        elif collect == 2:
            path = line.strip().split()[-1:]
            if path and path[0] not in ('.', '..'):
                existing_paths.add( path[0] )
    upload_paths = set( [os.path.basename(p) for p in antglob.glob( doc_dir )] )
    paths_to_remove = existing_paths - upload_paths
    if paths_to_remove:
        print 'Removing the following file from web:'
        print '\n'.join( paths_to_remove )
        stdout = run_sftp_batch( userhost, sftp, """cd htdocs
rm %s
exit""" % ' '.join(paths_to_remove) )
    print 'Uploading %d files:' % len(upload_paths)
    batch_size = 10
    upload_paths = list(upload_paths)
    start_time = time.time()
    for index in xrange(0,len(upload_paths),batch_size):
        paths = upload_paths[index:index+batch_size]
        file_per_sec = (time.time() - start_time) / (index+1)
        remaining_files = len(upload_paths) - index
        remaining_sec = file_per_sec * remaining_files
        print '%d/%d, ETA=%.1fs' % (index+1, len(upload_paths), remaining_sec)
        run_sftp_batch( userhost, sftp, """cd htdocs
lcd %s
mput %s
exit""" % (doc_dir, ' '.join(paths) ), retry=3 )

def sourceforge_release_tarball( sourceforge_project, paths, user=None, sftp='sftp' ):
    userhost = '%s,%s@frs.sourceforge.net' % (user, sourceforge_project)
    run_sftp_batch( userhost, sftp, """
mput %s
exit
""" % (' '.join(paths),) )


def main():
    usage = """%prog release_version next_dev_version
Update 'version' file to release_version and commit.
Generates the document tarball.
Tags the sandbox revision with release_version.
Update 'version' file to next_dev_version and commit.

Performs an svn export of tag release version, and build a source tarball.

Must be started in the project top directory.

Warning: --force should only be used when developping/testing the release script.
"""
    from optparse import OptionParser
    parser = OptionParser(usage=usage)
    parser.allow_interspersed_args = False
    parser.add_option('--dot', dest="dot_path", action='store', default=doxybuild.find_program('dot'),
        help="""Path to GraphViz dot tool. Must be full qualified path. [Default: %default]""")
    parser.add_option('--doxygen', dest="doxygen_path", action='store', default=doxybuild.find_program('doxygen'),
        help="""Path to Doxygen tool. [Default: %default]""")
    parser.add_option('--force', dest="ignore_pending_commit", action='store_true', default=False,
        help="""Ignore pending commit. [Default: %default]""")
    parser.add_option('--retag', dest="retag_release", action='store_true', default=False,
        help="""Overwrite release existing tag if it exist. [Default: %default]""")
    parser.add_option('-p', '--platforms', dest="platforms", action='store', default='',
        help="""Comma separated list of platform passed to scons for build check.""")
    parser.add_option('--no-test', dest="no_test", action='store_true', default=False,
        help="""Skips build check.""")
    parser.add_option('--no-web', dest="no_web", action='store_true', default=False,
        help="""Do not update web site.""")
    parser.add_option('-u', '--upload-user', dest="user", action='store',
        help="""Sourceforge user for SFTP documentation upload.""")
    parser.add_option('--sftp', dest='sftp', action='store', default=doxybuild.find_program('psftp', 'sftp'),
        help="""Path of the SFTP compatible binary used to upload the documentation.""")
    parser.enable_interspersed_args()
    options, args = parser.parse_args()

    if len(args) != 2:
        parser.error( 'release_version missing on command-line.' )
    release_version = args[0]
    next_version = args[1]

    if not options.platforms and not options.no_test:
        parser.error( 'You must specify either --platform or --no-test option.' )

    if options.ignore_pending_commit:
        msg = ''
    else:
        msg = check_no_pending_commit()
    if not msg:
        print 'Setting version to', release_version
        set_version( release_version )
        svn_commit( 'Release ' + release_version )
        tag_url = svn_join_url( SVN_TAG_ROOT, release_version )
        if svn_check_if_tag_exist( tag_url ):
            if options.retag_release:
                svn_remove_tag( tag_url, 'Overwriting previous tag' )
            else:
                print 'Aborting, tag %s already exist. Use --retag to overwrite it!' % tag_url
                sys.exit( 1 )
        svn_tag_sandbox( tag_url, 'Release ' + release_version )

        print 'Generated doxygen document...'
##        doc_dirname = r'jsoncpp-api-html-0.5.0'
##        doc_tarball_path = r'e:\prg\vc\Lib\jsoncpp-trunk\dist\jsoncpp-api-html-0.5.0.tar.gz'
        doc_tarball_path, doc_dirname = doxybuild.build_doc( options, make_release=True )
        doc_distcheck_dir = 'dist/doccheck'
        tarball.decompress( doc_tarball_path, doc_distcheck_dir )
        doc_distcheck_top_dir = os.path.join( doc_distcheck_dir, doc_dirname )

        export_dir = 'dist/export'
        svn_export( tag_url, export_dir )
        fix_sources_eol( export_dir )

        source_dir = 'jsoncpp-src-' + release_version
        source_tarball_path = 'dist/%s.tar.gz' % source_dir
        print 'Generating source tarball to', source_tarball_path
        tarball.make_tarball( source_tarball_path, [export_dir], export_dir, prefix_dir=source_dir )

        # Decompress source tarball, download and install scons-local
        distcheck_dir = 'dist/distcheck'
        distcheck_top_dir = distcheck_dir + '/' + source_dir
        print 'Decompressing source tarball to', distcheck_dir
        rmdir_if_exist( distcheck_dir )
        tarball.decompress( source_tarball_path, distcheck_dir )
        scons_local_path = 'dist/scons-local.tar.gz'
        print 'Downloading scons-local to', scons_local_path
        download( SCONS_LOCAL_URL, scons_local_path )
        print 'Decompressing scons-local to', distcheck_top_dir
        tarball.decompress( scons_local_path, distcheck_top_dir )

        # Run compilation
        print 'Compiling decompressed tarball'
        all_build_status = True
        for platform in options.platforms.split(','):
            print 'Testing platform:', platform
            build_status, log_path = check_compile( distcheck_top_dir, platform )
            print 'see build log:', log_path
            print build_status and '=> ok' or '=> FAILED'
            all_build_status = all_build_status and build_status
        if not build_status:
            print 'Testing failed on at least one platform, aborting...'
            svn_remove_tag( tag_url, 'Removing tag due to failed testing' )
            sys.exit(1)
        if options.user:
            if not options.no_web:
                print 'Uploading documentation using user', options.user
                sourceforge_web_synchro( SOURCEFORGE_PROJECT, doc_distcheck_top_dir, user=options.user, sftp=options.sftp )
                print 'Completed documentation upload'
            print 'Uploading source and documentation tarballs for release using user', options.user
            sourceforge_release_tarball( SOURCEFORGE_PROJECT,
                                         [source_tarball_path, doc_tarball_path],
                                         user=options.user, sftp=options.sftp )
            print 'Source and doc release tarballs uploaded'
        else:
            print 'No upload user specified. Web site and download tarbal were not uploaded.'
            print 'Tarball can be found at:', doc_tarball_path

        # Set next version number and commit
        set_version( next_version )
        svn_commit( 'Released ' + release_version )
    else:
        sys.stderr.write( msg + '\n' )

if __name__ == '__main__':
    main()
53
PowerEditor/src/jsoncpp/scons-tools/globtool.py
Normal file
@ -0,0 +1,53 @@
import fnmatch
import os

def generate( env ):
    def Glob( env, includes = None, excludes = None, dir = '.' ):
        """Adds Glob( includes = Split( '*' ), excludes = None, dir = '.')
        helper function to environment.

        Glob both the file-system files.

        includes: list of file name pattern included in the return list when matched.
        excludes: list of file name pattern excluded from the return list.

        Example:
        sources = env.Glob( ("*.cpp", '*.h'), "~*.cpp", "#src" )
        """
        def filterFilename(path):
            abs_path = os.path.join( dir, path )
            if not os.path.isfile(abs_path):
                return 0
            fn = os.path.basename(path)
            match = 0
            for include in includes:
                if fnmatch.fnmatchcase( fn, include ):
                    match = 1
                    break
            if match == 1 and not excludes is None:
                for exclude in excludes:
                    if fnmatch.fnmatchcase( fn, exclude ):
                        match = 0
                        break
            return match
        if includes is None:
            includes = ('*',)
        elif type(includes) in ( type(''), type(u'') ):
            includes = (includes,)
        if type(excludes) in ( type(''), type(u'') ):
            excludes = (excludes,)
        dir = env.Dir(dir).abspath
        paths = os.listdir( dir )
        def makeAbsFileNode( path ):
            return env.File( os.path.join( dir, path ) )
        nodes = filter( filterFilename, paths )
        return map( makeAbsFileNode, nodes )

    from SCons.Script import Environment
    Environment.Glob = Glob

def exists(env):
    """
    Tool always exists.
    """
    return True
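The include/exclude matching done by `filterFilename` above can be sketched in modern Python as follows (the file names here are illustrative, not from the repository):

```python
import fnmatch

def filter_names(names, includes=("*",), excludes=()):
    """Keep names matching any include pattern and no exclude pattern."""
    kept = []
    for name in names:
        # A name must match at least one include pattern...
        if not any(fnmatch.fnmatchcase(name, pat) for pat in includes):
            continue
        # ...and must not match any exclude pattern.
        if any(fnmatch.fnmatchcase(name, pat) for pat in excludes):
            continue
        kept.append(name)
    return kept

print(filter_names(["a.cpp", "a.h", "b.txt", "~a.cpp"],
                   includes=("*.cpp", "*.h"), excludes=("~*",)))  # → ['a.cpp', 'a.h']
```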
179
PowerEditor/src/jsoncpp/scons-tools/srcdist.py
Normal file
@ -0,0 +1,179 @@
import os
import os.path
from fnmatch import fnmatch
import targz

##def DoxyfileParse(file_contents):
##   """
##   Parse a Doxygen source file and return a dictionary of all the values.
##   Values will be strings and lists of strings.
##   """
##   data = {}
##
##   import shlex
##   lex = shlex.shlex(instream = file_contents, posix = True)
##   lex.wordchars += "*+./-:"
##   lex.whitespace = lex.whitespace.replace("\n", "")
##   lex.escape = ""
##
##   lineno = lex.lineno
##   last_backslash_lineno = lineno
##   token = lex.get_token()
##   key = token   # the first token should be a key
##   last_token = ""
##   key_token = False
##   next_key = False
##   new_data = True
##
##   def append_data(data, key, new_data, token):
##      if new_data or len(data[key]) == 0:
##         data[key].append(token)
##      else:
##         data[key][-1] += token
##
##   while token:
##      if token in ['\n']:
##         if last_token not in ['\\']:
##            key_token = True
##      elif token in ['\\']:
##         pass
##      elif key_token:
##         key = token
##         key_token = False
##      else:
##         if token == "+=":
##            if not data.has_key(key):
##               data[key] = list()
##         elif token == "=":
##            data[key] = list()
##         else:
##            append_data( data, key, new_data, token )
##            new_data = True
##
##      last_token = token
##      token = lex.get_token()
##
##      if last_token == '\\' and token != '\n':
##         new_data = False
##         append_data( data, key, new_data, '\\' )
##
##   # compress lists of len 1 into single strings
##   for (k, v) in data.items():
##      if len(v) == 0:
##         data.pop(k)
##
##      # items in the following list will be kept as lists and not converted to strings
##      if k in ["INPUT", "FILE_PATTERNS", "EXCLUDE_PATTERNS"]:
##         continue
##
##      if len(v) == 1:
##         data[k] = v[0]
##
##   return data
##
##def DoxySourceScan(node, env, path):
##   """
##   Doxygen Doxyfile source scanner. This should scan the Doxygen file and add
##   any files used to generate docs to the list of source files.
##   """
##   default_file_patterns = [
##      '*.c', '*.cc', '*.cxx', '*.cpp', '*.c++', '*.java', '*.ii', '*.ixx',
##      '*.ipp', '*.i++', '*.inl', '*.h', '*.hh ', '*.hxx', '*.hpp', '*.h++',
##      '*.idl', '*.odl', '*.cs', '*.php', '*.php3', '*.inc', '*.m', '*.mm',
##      '*.py',
##   ]
##
##   default_exclude_patterns = [
##      '*~',
##   ]
##
##   sources = []
##
##   data = DoxyfileParse(node.get_contents())
##
##   if data.get("RECURSIVE", "NO") == "YES":
##      recursive = True
##   else:
##      recursive = False
##
##   file_patterns = data.get("FILE_PATTERNS", default_file_patterns)
##   exclude_patterns = data.get("EXCLUDE_PATTERNS", default_exclude_patterns)
##
##   for node in data.get("INPUT", []):
##      if os.path.isfile(node):
##         sources.add(node)
##      elif os.path.isdir(node):
##         if recursive:
##            for root, dirs, files in os.walk(node):
##               for f in files:
##                  filename = os.path.join(root, f)
##
##                  pattern_check = reduce(lambda x, y: x or bool(fnmatch(filename, y)), file_patterns, False)
##                  exclude_check = reduce(lambda x, y: x and fnmatch(filename, y), exclude_patterns, True)
##
##                  if pattern_check and not exclude_check:
##                     sources.append(filename)
##         else:
##            for pattern in file_patterns:
##               sources.extend(glob.glob("/".join([node, pattern])))
##   sources = map( lambda path: env.File(path), sources )
##   return sources
##
##
##def DoxySourceScanCheck(node, env):
##   """Check if we should scan this file"""
##   return os.path.isfile(node.path)

def srcDistEmitter(source, target, env):
##   """Doxygen Doxyfile emitter"""
##   # possible output formats and their default values and output locations
##   output_formats = {
##      "HTML": ("YES", "html"),
##      "LATEX": ("YES", "latex"),
##      "RTF": ("NO", "rtf"),
##      "MAN": ("YES", "man"),
##      "XML": ("NO", "xml"),
##   }
##
##   data = DoxyfileParse(source[0].get_contents())
##
##   targets = []
##   out_dir = data.get("OUTPUT_DIRECTORY", ".")
##
##   # add our output locations
##   for (k, v) in output_formats.items():
##      if data.get("GENERATE_" + k, v[0]) == "YES":
##         targets.append(env.Dir( os.path.join(out_dir, data.get(k + "_OUTPUT", v[1]))) )
##
##   # don't clobber targets
##   for node in targets:
##      env.Precious(node)
##
##   # set up cleaning stuff
##   for node in targets:
##      env.Clean(node, node)
##
##   return (targets, source)
   return (target,source)

def generate(env):
   """
   Add builders and construction variables for the
   SrcDist tool.
   """
##   doxyfile_scanner = env.Scanner(
##      DoxySourceScan,
##      "DoxySourceScan",
##      scan_check = DoxySourceScanCheck,
##   )

   if targz.exists(env):
      srcdist_builder = targz.makeBuilder( srcDistEmitter )

      env['BUILDERS']['SrcDist'] = srcdist_builder

def exists(env):
   """
   Make sure srcdist exists.
   """
   return targz.exists(env)
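The disabled `DoxyfileParse` above tokenizes Doxyfile-style `KEY = value` settings with `shlex`. A minimal sketch of the same idea, using a hypothetical `parse_settings` helper rather than the commented-out code:

```python
import shlex

def parse_settings(text):
    """Parse simple KEY = value lines into a dict of token lists
    (a minimal sketch of Doxyfile-style parsing; not the full grammar)."""
    data = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        # shlex.split handles quoting, e.g. PROJECT_NAME = "My Lib"
        data[key.strip()] = shlex.split(value)
    return data

print(parse_settings("OUTPUT_DIRECTORY = docs\nFILE_PATTERNS = *.cpp *.h\n"))
```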
79
PowerEditor/src/jsoncpp/scons-tools/substinfile.py
Normal file
@ -0,0 +1,79 @@
import re
from SCons.Script import *  # the usual scons stuff you get in a SConscript

def generate(env):
    """
    Add builders and construction variables for the
    SubstInFile tool.

    Adds SubstInFile builder, which substitutes the keys->values of SUBST_DICT
    from the source to the target.
    The values of SUBST_DICT first have any construction variables expanded
    (its keys are not expanded).
    If a value of SUBST_DICT is a python callable function, it is called and
    the result is expanded as the value.
    If there's more than one source and more than one target, each target gets
    substituted from the corresponding source.
    """
    def do_subst_in_file(targetfile, sourcefile, dict):
        """Replace all instances of the keys of dict with their values.
        For example, if dict is {'%VERSION%': '1.2345', '%BASE%': 'MyProg'},
        then all instances of %VERSION% in the file will be replaced with 1.2345 etc.
        """
        try:
            f = open(sourcefile, 'rb')
            contents = f.read()
            f.close()
        except:
            raise SCons.Errors.UserError, "Can't read source file %s"%sourcefile
        for (k,v) in dict.items():
            contents = re.sub(k, v, contents)
        try:
            f = open(targetfile, 'wb')
            f.write(contents)
            f.close()
        except:
            raise SCons.Errors.UserError, "Can't write target file %s"%targetfile
        return 0 # success

    def subst_in_file(target, source, env):
        if not env.has_key('SUBST_DICT'):
            raise SCons.Errors.UserError, "SubstInFile requires SUBST_DICT to be set."
        d = dict(env['SUBST_DICT']) # copy it
        for (k,v) in d.items():
            if callable(v):
                d[k] = env.subst(v()).replace('\\','\\\\')
            elif SCons.Util.is_String(v):
                d[k] = env.subst(v).replace('\\','\\\\')
            else:
                raise SCons.Errors.UserError, "SubstInFile: key %s: %s must be a string or callable"%(k, repr(v))
        for (t,s) in zip(target, source):
            return do_subst_in_file(str(t), str(s), d)

    def subst_in_file_string(target, source, env):
        """This is what gets printed on the console."""
        return '\n'.join(['Substituting vars from %s into %s'%(str(s), str(t))
                          for (t,s) in zip(target, source)])

    def subst_emitter(target, source, env):
        """Add dependency from substituted SUBST_DICT to target.
        Returns original target, source tuple unchanged.
        """
        d = env['SUBST_DICT'].copy() # copy it
        for (k,v) in d.items():
            if callable(v):
                d[k] = env.subst(v())
            elif SCons.Util.is_String(v):
                d[k]=env.subst(v)
        Depends(target, SCons.Node.Python.Value(d))
        return target, source

    ## env.Append(TOOLS = 'substinfile')   # this should be automatically done by Scons ?!?
    subst_action = SCons.Action.Action( subst_in_file, subst_in_file_string )
    env['BUILDERS']['SubstInFile'] = Builder(action=subst_action, emitter=subst_emitter)

def exists(env):
    """
    Make sure tool exists.
    """
    return True
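The core of `do_subst_in_file` is a sequence of regex substitutions over the file contents. A self-contained sketch of that step (operating on a string rather than files, with illustrative keys):

```python
import re

def subst_text(contents, subst_dict):
    """Apply each key->value pair of subst_dict as a regex substitution,
    mirroring the loop inside do_subst_in_file above."""
    for key, value in subst_dict.items():
        contents = re.sub(key, value, contents)
    return contents

print(subst_text("version %VERSION% of %BASE%",
                 {"%VERSION%": "1.2345", "%BASE%": "MyProg"}))
# → version 1.2345 of MyProg
```

Note that the keys are treated as regex patterns, not literal strings, which is why the tool's `%VERSION%`-style markers (no regex metacharacters) are a safe choice.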
82
PowerEditor/src/jsoncpp/scons-tools/targz.py
Normal file
@ -0,0 +1,82 @@
"""tarball

Tool-specific initialization for tarball.

"""

## Commands to tackle a command based implementation:
##to unpack on the fly...
##gunzip < FILE.tar.gz | tar xvf -
##to pack on the fly...
##tar cvf - FILE-LIST | gzip -c > FILE.tar.gz

import os.path

import SCons.Builder
import SCons.Node.FS
import SCons.Util

try:
    import gzip
    import tarfile
    internal_targz = 1
except ImportError:
    internal_targz = 0

TARGZ_DEFAULT_COMPRESSION_LEVEL = 9

if internal_targz:
    def targz(target, source, env):
        def archive_name( path ):
            path = os.path.normpath( os.path.abspath( path ) )
            common_path = os.path.commonprefix( (base_dir, path) )
            archive_name = path[len(common_path):]
            return archive_name

        def visit(tar, dirname, names):
            for name in names:
                path = os.path.join(dirname, name)
                if os.path.isfile(path):
                    tar.add(path, archive_name(path) )
        compression = env.get('TARGZ_COMPRESSION_LEVEL',TARGZ_DEFAULT_COMPRESSION_LEVEL)
        base_dir = os.path.normpath( env.get('TARGZ_BASEDIR', env.Dir('.')).abspath )
        target_path = str(target[0])
        fileobj = gzip.GzipFile( target_path, 'wb', compression )
        tar = tarfile.TarFile(os.path.splitext(target_path)[0], 'w', fileobj)
        for source in source:
            source_path = str(source)
            if source.isdir():
                os.path.walk(source_path, visit, tar)
            else:
                tar.add(source_path, archive_name(source_path) ) # filename, arcname
        tar.close()

    targzAction = SCons.Action.Action(targz, varlist=['TARGZ_COMPRESSION_LEVEL','TARGZ_BASEDIR'])

    def makeBuilder( emitter = None ):
        return SCons.Builder.Builder(action = SCons.Action.Action('$TARGZ_COM', '$TARGZ_COMSTR'),
                                     source_factory = SCons.Node.FS.Entry,
                                     source_scanner = SCons.Defaults.DirScanner,
                                     suffix = '$TARGZ_SUFFIX',
                                     multi = 1)
    TarGzBuilder = makeBuilder()

    def generate(env):
        """Add Builders and construction variables for zip to an Environment.
        The following environment variables may be set:
        TARGZ_COMPRESSION_LEVEL: integer, [0-9]. 0: no compression, 9: best compression (same as gzip compression level).
        TARGZ_BASEDIR: base-directory used to determine archive name (this allows archive name to be relative
        to something other than top-dir).
        """
        env['BUILDERS']['TarGz'] = TarGzBuilder
        env['TARGZ_COM'] = targzAction
        env['TARGZ_COMPRESSION_LEVEL'] = TARGZ_DEFAULT_COMPRESSION_LEVEL # range 0-9
        env['TARGZ_SUFFIX'] = '.tar.gz'
        env['TARGZ_BASEDIR'] = env.Dir('.') # Sources archive name are made relative to that directory.
else:
    def generate(env):
        pass

def exists(env):
    return internal_targz
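The `targz` action above layers `tarfile.TarFile` over a `gzip.GzipFile`, which was necessary in that era of Python. A sketch of the same archive round-trip in modern Python, where `tarfile.open` handles the gzip stream directly (done in memory here so it is self-contained):

```python
import io
import tarfile

# Build a .tar.gz in memory: modern tarfile opens gzip streams via mode "w:gz",
# so the explicit gzip.GzipFile wrapper used by the tool is no longer needed.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz", compresslevel=9) as tar:
    payload = b"hello"
    info = tarfile.TarInfo(name="dir/hello.txt")  # arcname, like archive_name()
    info.size = len(payload)
    tar.addfile(info, io.BytesIO(payload))

# Read the archive back and list its members.
buf.seek(0)
with tarfile.open(fileobj=buf, mode="r:gz") as tar:
    names = tar.getnames()
print(names)  # → ['dir/hello.txt']
```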
233
PowerEditor/src/jsoncpp/src/jsontestrunner/main.cpp
Normal file
@ -0,0 +1,233 @@
#include <json/json.h>
#include <algorithm> // sort
#include <stdio.h>

#if defined(_MSC_VER)  &&  _MSC_VER >= 1310
# pragma warning( disable: 4996 )     // disable fopen deprecation warning
#endif

static std::string
readInputTestFile( const char *path )
{
   FILE *file = fopen( path, "rb" );
   if ( !file )
      return std::string("");
   fseek( file, 0, SEEK_END );
   long size = ftell( file );
   fseek( file, 0, SEEK_SET );
   std::string text;
   char *buffer = new char[size+1];
   buffer[size] = 0;
   if ( fread( buffer, 1, size, file ) == (unsigned long)size )
      text = buffer;
   fclose( file );
   delete[] buffer;
   return text;
}


static void
printValueTree( FILE *fout, Json::Value &value, const std::string &path = "." )
{
   switch ( value.type() )
   {
   case Json::nullValue:
      fprintf( fout, "%s=null\n", path.c_str() );
      break;
   case Json::intValue:
      fprintf( fout, "%s=%d\n", path.c_str(), value.asInt() );
      break;
   case Json::uintValue:
      fprintf( fout, "%s=%u\n", path.c_str(), value.asUInt() );
      break;
   case Json::realValue:
      fprintf( fout, "%s=%.16g\n", path.c_str(), value.asDouble() );
      break;
   case Json::stringValue:
      fprintf( fout, "%s=\"%s\"\n", path.c_str(), value.asString().c_str() );
      break;
   case Json::booleanValue:
      fprintf( fout, "%s=%s\n", path.c_str(), value.asBool() ? "true" : "false" );
      break;
   case Json::arrayValue:
      {
         fprintf( fout, "%s=[]\n", path.c_str() );
         int size = value.size();
         for ( int index =0; index < size; ++index )
         {
            static char buffer[16];
            sprintf( buffer, "[%d]", index );
            printValueTree( fout, value[index], path + buffer );
         }
      }
      break;
   case Json::objectValue:
      {
         fprintf( fout, "%s={}\n", path.c_str() );
         Json::Value::Members members( value.getMemberNames() );
         std::sort( members.begin(), members.end() );
         std::string suffix = *(path.end()-1) == '.' ? "" : ".";
         for ( Json::Value::Members::iterator it = members.begin();
               it != members.end();
               ++it )
         {
            const std::string &name = *it;
            printValueTree( fout, value[name], path + suffix + name );
         }
      }
      break;
   default:
      break;
   }
}


static int
parseAndSaveValueTree( const std::string &input,
                       const std::string &actual,
                       const std::string &kind,
                       Json::Value &root,
                       const Json::Features &features,
                       bool parseOnly )
{
   Json::Reader reader( features );
   bool parsingSuccessful = reader.parse( input, root );
   if ( !parsingSuccessful )
   {
      printf( "Failed to parse %s file: \n%s\n",
              kind.c_str(),
              reader.getFormatedErrorMessages().c_str() );
      return 1;
   }

   if ( !parseOnly )
   {
      FILE *factual = fopen( actual.c_str(), "wt" );
      if ( !factual )
      {
         printf( "Failed to create %s actual file.\n", kind.c_str() );
         return 2;
      }
      printValueTree( factual, root );
      fclose( factual );
   }
   return 0;
}


static int
rewriteValueTree( const std::string &rewritePath,
                  const Json::Value &root,
                  std::string &rewrite )
{
   //Json::FastWriter writer;
   //writer.enableYAMLCompatibility();
   Json::StyledWriter writer;
   rewrite = writer.write( root );
   FILE *fout = fopen( rewritePath.c_str(), "wt" );
   if ( !fout )
   {
      printf( "Failed to create rewrite file: %s\n", rewritePath.c_str() );
      return 2;
   }
   fprintf( fout, "%s\n", rewrite.c_str() );
   fclose( fout );
   return 0;
}


static std::string
removeSuffix( const std::string &path,
              const std::string &extension )
{
   if ( extension.length() >= path.length() )
      return std::string("");
   std::string suffix = path.substr( path.length() - extension.length() );
   if ( suffix != extension )
      return std::string("");
   return path.substr( 0, path.length() - extension.length() );
}

static int
printUsage( const char *argv[] )
{
   printf( "Usage: %s [--strict] input-json-file\n", argv[0] );
   return 3;
}


int
parseCommandLine( int argc, const char *argv[],
                  Json::Features &features, std::string &path,
                  bool &parseOnly )
{
   parseOnly = false;
   if ( argc < 2 )
   {
      return printUsage( argv );
   }

   int index = 1;
   if ( std::string(argv[1]) == "--json-checker" )
   {
      features = Json::Features::strictMode();
      parseOnly = true;
      ++index;
   }

   if ( index == argc  ||  index + 1 < argc )
   {
      return printUsage( argv );
   }

   path = argv[index];
   return 0;
}


int main( int argc, const char *argv[] )
{
   std::string path;
   Json::Features features;
   bool parseOnly;
   int exitCode = parseCommandLine( argc, argv, features, path, parseOnly );
   if ( exitCode != 0 )
   {
      return exitCode;
   }

   std::string input = readInputTestFile( path.c_str() );
   if ( input.empty() )
   {
      printf( "Failed to read input or empty input: %s\n", path.c_str() );
      return 3;
   }

   std::string basePath = removeSuffix( argv[1], ".json" );
   if ( !parseOnly  &&  basePath.empty() )
   {
      printf( "Bad input path. Path does not end with '.json':\n%s\n", path.c_str() );
      return 3;
   }

   std::string actualPath = basePath + ".actual";
   std::string rewritePath = basePath + ".rewrite";
   std::string rewriteActualPath = basePath + ".actual-rewrite";

   Json::Value root;
   exitCode = parseAndSaveValueTree( input, actualPath, "input", root, features, parseOnly );
   if ( exitCode == 0  &&  !parseOnly )
   {
      std::string rewrite;
      exitCode = rewriteValueTree( rewritePath, root, rewrite );
      if ( exitCode == 0 )
      {
         Json::Value rewriteRoot;
         exitCode = parseAndSaveValueTree( rewrite, rewriteActualPath,
                                           "rewrite", rewriteRoot, features, parseOnly );
      }
   }

   return exitCode;
}
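The test runner above checks that parsing a document, rewriting it with `StyledWriter`, and re-parsing the result yields the same value tree. That round-trip invariant can be sketched with Python's standard `json` module (this illustrates the check only; it is not the jsoncpp API):

```python
import json

def round_trip(text):
    """Parse, rewrite, and re-parse a JSON document; both trees are returned
    so the caller can verify they match, as jsontestrunner does with its
    .rewrite files."""
    root = json.loads(text)
    rewritten = json.dumps(root, indent=3)  # styled output, in the spirit of StyledWriter
    return root, json.loads(rewritten)

first, second = round_trip('{"a": [1, 2.5, "x"], "b": null}')
print(first == second)  # → True
```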
9
PowerEditor/src/jsoncpp/src/jsontestrunner/sconscript
Normal file
@ -0,0 +1,9 @@
Import( 'env_testing buildJSONTests' )

buildJSONTests( env_testing, Split( """
    main.cpp
    """ ),
    'jsontestrunner' )

# For 'check' to work, 'libs' must be built first.
env_testing.Depends('jsontestrunner', '#libs')
125
PowerEditor/src/jsoncpp/src/lib_json/json_batchallocator.h
Normal file
@ -0,0 +1,125 @@
#ifndef JSONCPP_BATCHALLOCATOR_H_INCLUDED
# define JSONCPP_BATCHALLOCATOR_H_INCLUDED

# include <stdlib.h>
# include <assert.h>

# ifndef JSONCPP_DOC_EXCLUDE_IMPLEMENTATION

namespace Json {

/* Fast memory allocator.
 *
 * This memory allocator allocates memory for a batch of object (specified by
 * the page size, the number of object in each page).
 *
 * It does not allow the destruction of a single object. All the allocated objects
 * can be destroyed at once. The memory can be either released or reused for future
 * allocation.
 *
 * The in-place new operator must be used to construct the object using the pointer
 * returned by allocate.
 */
template<typename AllocatedType
        ,const unsigned int objectPerAllocation>
class BatchAllocator
{
public:
   typedef AllocatedType Type;

   BatchAllocator( unsigned int objectsPerPage = 255 )
      : freeHead_( 0 )
      , objectsPerPage_( objectsPerPage )
   {
//      printf( "Size: %d => %s\n", sizeof(AllocatedType), typeid(AllocatedType).name() );
      assert( sizeof(AllocatedType) * objectPerAllocation >= sizeof(AllocatedType *) ); // We must be able to store a slist in the object free space.
      assert( objectsPerPage >= 16 );
      batches_ = allocateBatch( 0 );   // allocated a dummy page
      currentBatch_ = batches_;
   }

   ~BatchAllocator()
   {
      for ( BatchInfo *batch = batches_; batch; )
      {
         BatchInfo *nextBatch = batch->next_;
         free( batch );
         batch = nextBatch;
      }
   }

   /// allocate space for an array of objectPerAllocation object.
   /// @warning it is the responsibility of the caller to call objects constructors.
   AllocatedType *allocate()
   {
      if ( freeHead_ ) // returns node from free list.
      {
         AllocatedType *object = freeHead_;
         freeHead_ = *(AllocatedType **)object;
         return object;
      }
      if ( currentBatch_->used_ == currentBatch_->end_ )
      {
         currentBatch_ = currentBatch_->next_;
         while ( currentBatch_  &&  currentBatch_->used_ == currentBatch_->end_ )
            currentBatch_ = currentBatch_->next_;

         if ( !currentBatch_ ) // no free batch found, allocate a new one
         {
            currentBatch_ = allocateBatch( objectsPerPage_ );
            currentBatch_->next_ = batches_; // insert at the head of the list
            batches_ = currentBatch_;
         }
      }
      AllocatedType *allocated = currentBatch_->used_;
      currentBatch_->used_ += objectPerAllocation;
      return allocated;
   }

   /// Release the object.
   /// @warning it is the responsibility of the caller to actually destruct the object.
   void release( AllocatedType *object )
   {
      assert( object != 0 );
      *(AllocatedType **)object = freeHead_;
      freeHead_ = object;
   }

private:
   struct BatchInfo
   {
      BatchInfo *next_;
      AllocatedType *used_;
      AllocatedType *end_;
      AllocatedType buffer_[objectPerAllocation];
   };

   // disabled copy constructor and assignment operator.
   BatchAllocator( const BatchAllocator & );
   void operator =( const BatchAllocator &);

   static BatchInfo *allocateBatch( unsigned int objectsPerPage )
   {
      const unsigned int mallocSize = sizeof(BatchInfo) - sizeof(AllocatedType)* objectPerAllocation
                                + sizeof(AllocatedType) * objectPerAllocation * objectsPerPage;
      BatchInfo *batch = static_cast<BatchInfo*>( malloc( mallocSize ) );
      batch->next_ = 0;
      batch->used_ = batch->buffer_;
      batch->end_ = batch->buffer_ + objectsPerPage;
      return batch;
   }

   BatchInfo *batches_;
   BatchInfo *currentBatch_;
   /// Head of a single linked list within the allocated space of freed object
   AllocatedType *freeHead_;
   unsigned int objectsPerPage_;
};


} // namespace Json

# endif // ifndef JSONCPP_DOC_INCLUDE_IMPLEMENTATION

#endif // JSONCPP_BATCHALLOCATOR_H_INCLUDED
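The allocator above hands out slots from pre-allocated pages and threads released slots onto an intrusive free list instead of freeing them individually. A minimal Python sketch of that scheme (slots are modeled as (page, offset) tuples rather than raw pointers):

```python
class BatchAllocatorSketch:
    """Sketch of batch allocation: hand out slots from pre-allocated pages
    and recycle released slots through a free list; no per-object free."""
    def __init__(self, objects_per_page=4):
        self.objects_per_page = objects_per_page
        self.pages = []
        self.used = objects_per_page  # force a page allocation on first use
        self.free_list = []

    def allocate(self):
        if self.free_list:                      # reuse a released slot first
            return self.free_list.pop()
        if self.used == self.objects_per_page:  # current page exhausted
            self.pages.append([None] * self.objects_per_page)
            self.used = 0
        slot = (len(self.pages) - 1, self.used)
        self.used += 1
        return slot

    def release(self, slot):
        self.free_list.append(slot)             # caller must not use slot again

alloc = BatchAllocatorSketch(objects_per_page=2)
a = alloc.allocate(); b = alloc.allocate(); c = alloc.allocate()
alloc.release(b)
print(alloc.allocate() == b, len(alloc.pages))  # → True 2
```

The C++ version gains the same amortized behavior while also storing the free list inside the released objects themselves, which is why `allocate` requires the object size to hold at least a pointer.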
448
PowerEditor/src/jsoncpp/src/lib_json/json_internalarray.inl
Normal file
@ -0,0 +1,448 @@
|
|||||||
|
// included by json_value.cpp
|
||||||
|
// everything is within Json namespace
|
||||||
|
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// class ValueInternalArray
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
|
||||||
|
ValueArrayAllocator::~ValueArrayAllocator()
|
||||||
|
{
|
||||||
|
}
|
||||||
|
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// class DefaultValueArrayAllocator
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
#ifdef JSON_USE_SIMPLE_INTERNAL_ALLOCATOR
|
||||||
|
class DefaultValueArrayAllocator : public ValueArrayAllocator
|
||||||
|
{
|
||||||
|
public: // overridden from ValueArrayAllocator
|
||||||
|
virtual ~DefaultValueArrayAllocator()
|
||||||
|
{
|
||||||
|
}
|
||||||
|
|
||||||
|
virtual ValueInternalArray *newArray()
|
||||||
|
{
|
||||||
|
return new ValueInternalArray();
|
||||||
|
}
|
||||||
|
|
||||||
|
virtual ValueInternalArray *newArrayCopy( const ValueInternalArray &other )
|
||||||
|
{
|
||||||
|
return new ValueInternalArray( other );
|
||||||
|
}
|
||||||
|
|
||||||
|
virtual void destructArray( ValueInternalArray *array )
|
||||||
|
{
|
||||||
|
delete array;
|
||||||
|
}
|
||||||
|
|
||||||
|
virtual void reallocateArrayPageIndex( Value **&indexes,
|
||||||
|
ValueInternalArray::PageIndex &indexCount,
|
||||||
|
ValueInternalArray::PageIndex minNewIndexCount )
|
||||||
|
{
|
||||||
|
ValueInternalArray::PageIndex newIndexCount = (indexCount*3)/2 + 1;
|
||||||
|
if ( minNewIndexCount > newIndexCount )
|
||||||
|
newIndexCount = minNewIndexCount;
|
||||||
|
void *newIndexes = realloc( indexes, sizeof(Value*) * newIndexCount );
|
||||||
|
if ( !newIndexes )
|
||||||
|
throw std::bad_alloc();
|
||||||
|
indexCount = newIndexCount;
|
||||||
|
indexes = static_cast<Value **>( newIndexes );
|
||||||
|
}
|
||||||
|
virtual void releaseArrayPageIndex( Value **indexes,
|
||||||
|
ValueInternalArray::PageIndex indexCount )
|
||||||
|
{
|
||||||
|
if ( indexes )
|
||||||
|
free( indexes );
|
||||||
|
}
|
||||||
|
|
||||||
|
virtual Value *allocateArrayPage()
|
||||||
|
{
|
||||||
|
return static_cast<Value *>( malloc( sizeof(Value) * ValueInternalArray::itemsPerPage ) );
|
||||||
|
}
|
||||||
|
|
||||||
|
virtual void releaseArrayPage( Value *value )
|
||||||
|
{
|
||||||
|
if ( value )
|
||||||
|
free( value );
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
#else // #ifdef JSON_USE_SIMPLE_INTERNAL_ALLOCATOR
/// @todo make this thread-safe (lock when accessing batch allocator)
class DefaultValueArrayAllocator : public ValueArrayAllocator
{
public: // overridden from ValueArrayAllocator
   virtual ~DefaultValueArrayAllocator()
   {
   }

   virtual ValueInternalArray *newArray()
   {
      ValueInternalArray *array = arraysAllocator_.allocate();
      new (array) ValueInternalArray(); // placement new
      return array;
   }

   virtual ValueInternalArray *newArrayCopy( const ValueInternalArray &other )
   {
      ValueInternalArray *array = arraysAllocator_.allocate();
      new (array) ValueInternalArray( other ); // placement new
      return array;
   }

   virtual void destructArray( ValueInternalArray *array )
   {
      if ( array )
      {
         array->~ValueInternalArray();
         arraysAllocator_.release( array );
      }
   }

   virtual void reallocateArrayPageIndex( Value **&indexes, 
                                          ValueInternalArray::PageIndex &indexCount,
                                          ValueInternalArray::PageIndex minNewIndexCount )
   {
      ValueInternalArray::PageIndex newIndexCount = (indexCount*3)/2 + 1;
      if ( minNewIndexCount > newIndexCount )
         newIndexCount = minNewIndexCount;
      void *newIndexes = realloc( indexes, sizeof(Value*) * newIndexCount );
      if ( !newIndexes )
         throw std::bad_alloc();
      indexCount = newIndexCount;
      indexes = static_cast<Value **>( newIndexes );
   }
   virtual void releaseArrayPageIndex( Value **indexes, 
                                       ValueInternalArray::PageIndex indexCount )
   {
      if ( indexes )
         free( indexes );
   }

   virtual Value *allocateArrayPage()
   {
      return static_cast<Value *>( pagesAllocator_.allocate() );
   }

   virtual void releaseArrayPage( Value *value )
   {
      if ( value )
         pagesAllocator_.release( value );
   }
private:
   BatchAllocator<ValueInternalArray,1> arraysAllocator_;
   BatchAllocator<Value,ValueInternalArray::itemsPerPage> pagesAllocator_;
};
#endif // #ifdef JSON_USE_SIMPLE_INTERNAL_ALLOCATOR

static ValueArrayAllocator *&arrayAllocator()
{
   static DefaultValueArrayAllocator defaultAllocator;
   static ValueArrayAllocator *arrayAllocator = &defaultAllocator;
   return arrayAllocator;
}

static struct DummyArrayAllocatorInitializer {
   DummyArrayAllocatorInitializer()
   {
      arrayAllocator(); // ensure arrayAllocator() statics are initialized before main().
   }
} dummyArrayAllocatorInitializer;

// //////////////////////////////////////////////////////////////////
// class ValueInternalArray
// //////////////////////////////////////////////////////////////////
bool 
ValueInternalArray::equals( const IteratorState &x, 
                            const IteratorState &other )
{
   return x.array_ == other.array_
          &&  x.currentItemIndex_ == other.currentItemIndex_
          &&  x.currentPageIndex_ == other.currentPageIndex_;
}


void 
ValueInternalArray::increment( IteratorState &it )
{
   JSON_ASSERT_MESSAGE( it.array_  &&
      (it.currentPageIndex_ - it.array_->pages_)*itemsPerPage + it.currentItemIndex_
      != it.array_->size_,
      "ValueInternalArray::increment(): moving iterator beyond end" );
   ++(it.currentItemIndex_);
   if ( it.currentItemIndex_ == itemsPerPage )
   {
      it.currentItemIndex_ = 0;
      ++(it.currentPageIndex_);
   }
}

void 
ValueInternalArray::decrement( IteratorState &it )
{
   JSON_ASSERT_MESSAGE( it.array_  &&  !( it.currentPageIndex_ == it.array_->pages_
                        &&  it.currentItemIndex_ == 0 ),
      "ValueInternalArray::decrement(): moving iterator beyond beginning" );
   if ( it.currentItemIndex_ == 0 )
   {
      it.currentItemIndex_ = itemsPerPage-1;
      --(it.currentPageIndex_);
   }
   else
   {
      --(it.currentItemIndex_);
   }
}

Value &
ValueInternalArray::unsafeDereference( const IteratorState &it )
{
   return (*(it.currentPageIndex_))[it.currentItemIndex_];
}


Value &
ValueInternalArray::dereference( const IteratorState &it )
{
   JSON_ASSERT_MESSAGE( it.array_  &&
      (it.currentPageIndex_ - it.array_->pages_)*itemsPerPage + it.currentItemIndex_
      < it.array_->size_,
      "ValueInternalArray::dereference(): dereferencing invalid iterator" );
   return unsafeDereference( it );
}

void 
ValueInternalArray::makeBeginIterator( IteratorState &it ) const
{
   it.array_ = const_cast<ValueInternalArray *>( this );
   it.currentItemIndex_ = 0;
   it.currentPageIndex_ = pages_;
}


void 
ValueInternalArray::makeIterator( IteratorState &it, ArrayIndex index ) const
{
   it.array_ = const_cast<ValueInternalArray *>( this );
   it.currentItemIndex_ = index % itemsPerPage;
   it.currentPageIndex_ = pages_ + index / itemsPerPage;
}


void 
ValueInternalArray::makeEndIterator( IteratorState &it ) const
{
   makeIterator( it, size_ );
}

ValueInternalArray::ValueInternalArray()
   : pages_( 0 )
   , size_( 0 )
   , pageCount_( 0 )
{
}


ValueInternalArray::ValueInternalArray( const ValueInternalArray &other )
   : pages_( 0 )
   , pageCount_( 0 )
   , size_( other.size_ )
{
   PageIndex minNewPages = other.size_ / itemsPerPage;
   arrayAllocator()->reallocateArrayPageIndex( pages_, pageCount_, minNewPages );
   JSON_ASSERT_MESSAGE( pageCount_ >= minNewPages, 
                        "ValueInternalArray::reserve(): bad reallocation" );
   IteratorState itOther;
   other.makeBeginIterator( itOther );
   Value *value;
   for ( ArrayIndex index = 0; index < size_; ++index, increment(itOther) )
   {
      if ( index % itemsPerPage == 0 )
      {
         PageIndex pageIndex = index / itemsPerPage;
         value = arrayAllocator()->allocateArrayPage();
         pages_[pageIndex] = value;
      }
      new (value) Value( dereference( itOther ) );
   }
}


ValueInternalArray &
ValueInternalArray::operator =( const ValueInternalArray &other )
{
   ValueInternalArray temp( other );
   swap( temp );
   return *this;
}


ValueInternalArray::~ValueInternalArray()
{
   // destroy all constructed items
   IteratorState it;
   IteratorState itEnd;
   makeBeginIterator( it );
   makeEndIterator( itEnd );
   for ( ; !equals(it,itEnd); increment(it) )
   {
      Value *value = &dereference(it);
      value->~Value();
   }
   // release all pages
   PageIndex lastPageIndex = size_ / itemsPerPage;
   for ( PageIndex pageIndex = 0; pageIndex < lastPageIndex; ++pageIndex )
      arrayAllocator()->releaseArrayPage( pages_[pageIndex] );
   // release pages index
   arrayAllocator()->releaseArrayPageIndex( pages_, pageCount_ );
}


void 
ValueInternalArray::swap( ValueInternalArray &other )
{
   Value **tempPages = pages_;
   pages_ = other.pages_;
   other.pages_ = tempPages;
   ArrayIndex tempSize = size_;
   size_ = other.size_;
   other.size_ = tempSize;
   PageIndex tempPageCount = pageCount_;
   pageCount_ = other.pageCount_;
   other.pageCount_ = tempPageCount;
}

void 
ValueInternalArray::clear()
{
   ValueInternalArray dummy;
   swap( dummy );
}

void 
ValueInternalArray::resize( ArrayIndex newSize )
{
   if ( newSize == 0 )
      clear();
   else if ( newSize < size_ )
   {
      IteratorState it;
      IteratorState itEnd;
      makeIterator( it, newSize );
      makeIterator( itEnd, size_ );
      for ( ; !equals(it,itEnd); increment(it) )
      {
         Value *value = &dereference(it);
         value->~Value();
      }
      PageIndex pageIndex = (newSize + itemsPerPage - 1) / itemsPerPage;
      PageIndex lastPageIndex = size_ / itemsPerPage;
      for ( ; pageIndex < lastPageIndex; ++pageIndex )
         arrayAllocator()->releaseArrayPage( pages_[pageIndex] );
      size_ = newSize;
   }
   else if ( newSize > size_ )
      resolveReference( newSize );
}

void 
ValueInternalArray::makeIndexValid( ArrayIndex index )
{
   // Need to enlarge page index ?
   if ( index >= pageCount_ * itemsPerPage )
   {
      PageIndex minNewPages = (index + 1) / itemsPerPage;
      arrayAllocator()->reallocateArrayPageIndex( pages_, pageCount_, minNewPages );
      JSON_ASSERT_MESSAGE( pageCount_ >= minNewPages, "ValueInternalArray::reserve(): bad reallocation" );
   }

   // Need to allocate new pages ?
   ArrayIndex nextPageIndex = 
      (size_ % itemsPerPage) != 0 ? size_ - (size_%itemsPerPage) + itemsPerPage
                                  : size_;
   if ( nextPageIndex <= index )
   {
      PageIndex pageIndex = nextPageIndex / itemsPerPage;
      PageIndex pageToAllocate = (index - nextPageIndex) / itemsPerPage + 1;
      for ( ; pageToAllocate-- > 0; ++pageIndex )
         pages_[pageIndex] = arrayAllocator()->allocateArrayPage();
   }

   // Initialize all new entries
   IteratorState it;
   IteratorState itEnd;
   makeIterator( it, size_ );
   size_ = index + 1;
   makeIterator( itEnd, size_ );
   for ( ; !equals(it,itEnd); increment(it) )
   {
      Value *value = &dereference(it);
      new (value) Value(); // Construct a default value using placement new
   }
}

Value &
ValueInternalArray::resolveReference( ArrayIndex index )
{
   if ( index >= size_ )
      makeIndexValid( index );
   return pages_[index/itemsPerPage][index%itemsPerPage];
}

Value *
ValueInternalArray::find( ArrayIndex index ) const
{
   if ( index >= size_ )
      return 0;
   return &(pages_[index/itemsPerPage][index%itemsPerPage]);
}

ValueInternalArray::ArrayIndex 
ValueInternalArray::size() const
{
   return size_;
}

int 
ValueInternalArray::distance( const IteratorState &x, const IteratorState &y )
{
   return indexOf(y) - indexOf(x);
}


ValueInternalArray::ArrayIndex 
ValueInternalArray::indexOf( const IteratorState &iterator )
{
   if ( !iterator.array_ )
      return ArrayIndex(-1);
   return ArrayIndex(
      (iterator.currentPageIndex_ - iterator.array_->pages_) * itemsPerPage
      + iterator.currentItemIndex_ );
}


int 
ValueInternalArray::compare( const ValueInternalArray &other ) const
{
   int sizeDiff( size_ - other.size_ );
   if ( sizeDiff != 0 )
      return sizeDiff;

   for ( ArrayIndex index = 0; index < size_; ++index )
   {
      int diff = pages_[index/itemsPerPage][index%itemsPerPage].compare( 
         other.pages_[index/itemsPerPage][index%itemsPerPage] );
      if ( diff != 0 )
         return diff;
   }
   return 0;
}
607
PowerEditor/src/jsoncpp/src/lib_json/json_internalmap.inl
Normal file
@ -0,0 +1,607 @@
// included by json_value.cpp
// everything is within Json namespace

// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// class ValueInternalMap
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////

/** \internal MUST be safely initialized using memset( this, 0, sizeof(ValueInternalLink) );
   * This optimization is used by the fast allocator.
   */
ValueInternalLink::ValueInternalLink()
   : previous_( 0 )
   , next_( 0 )
{
}

ValueInternalLink::~ValueInternalLink()
{
   for ( int index = 0; index < itemPerLink; ++index )
   {
      if ( !items_[index].isItemAvailable() )
      {
         if ( !items_[index].isMemberNameStatic() )
            free( keys_[index] );
      }
      else
         break;
   }
}



ValueMapAllocator::~ValueMapAllocator()
{
}

#ifdef JSON_USE_SIMPLE_INTERNAL_ALLOCATOR
class DefaultValueMapAllocator : public ValueMapAllocator
{
public: // overridden from ValueMapAllocator
   virtual ValueInternalMap *newMap()
   {
      return new ValueInternalMap();
   }

   virtual ValueInternalMap *newMapCopy( const ValueInternalMap &other )
   {
      return new ValueInternalMap( other );
   }

   virtual void destructMap( ValueInternalMap *map )
   {
      delete map;
   }

   virtual ValueInternalLink *allocateMapBuckets( unsigned int size )
   {
      return new ValueInternalLink[size];
   }

   virtual void releaseMapBuckets( ValueInternalLink *links )
   {
      delete [] links;
   }

   virtual ValueInternalLink *allocateMapLink()
   {
      return new ValueInternalLink();
   }

   virtual void releaseMapLink( ValueInternalLink *link )
   {
      delete link;
   }
};
#else
/// @todo make this thread-safe (lock when accessing batch allocator)
class DefaultValueMapAllocator : public ValueMapAllocator
{
public: // overridden from ValueMapAllocator
   virtual ValueInternalMap *newMap()
   {
      ValueInternalMap *map = mapsAllocator_.allocate();
      new (map) ValueInternalMap(); // placement new
      return map;
   }

   virtual ValueInternalMap *newMapCopy( const ValueInternalMap &other )
   {
      ValueInternalMap *map = mapsAllocator_.allocate();
      new (map) ValueInternalMap( other ); // placement new
      return map;
   }

   virtual void destructMap( ValueInternalMap *map )
   {
      if ( map )
      {
         map->~ValueInternalMap();
         mapsAllocator_.release( map );
      }
   }

   virtual ValueInternalLink *allocateMapBuckets( unsigned int size )
   {
      return new ValueInternalLink[size];
   }

   virtual void releaseMapBuckets( ValueInternalLink *links )
   {
      delete [] links;
   }

   virtual ValueInternalLink *allocateMapLink()
   {
      ValueInternalLink *link = linksAllocator_.allocate();
      memset( link, 0, sizeof(ValueInternalLink) );
      return link;
   }

   virtual void releaseMapLink( ValueInternalLink *link )
   {
      link->~ValueInternalLink();
      linksAllocator_.release( link );
   }
private:
   BatchAllocator<ValueInternalMap,1> mapsAllocator_;
   BatchAllocator<ValueInternalLink,1> linksAllocator_;
};
#endif

static ValueMapAllocator *&mapAllocator()
{
   static DefaultValueMapAllocator defaultAllocator;
   static ValueMapAllocator *mapAllocator = &defaultAllocator;
   return mapAllocator;
}

static struct DummyMapAllocatorInitializer {
   DummyMapAllocatorInitializer()
   {
      mapAllocator(); // ensure mapAllocator() statics are initialized before main().
   }
} dummyMapAllocatorInitializer;



// h(K) = value * K >> w ; with w = 32 & K prime w.r.t. 2^32.

/*
   Uses a linked-list hash map:
   the buckets array is a container;
   each linked-list element contains 6 key/value pairs (memory = (16+4) * 6 + 4 = 124);
   values carry extra state: valid, available, deleted.
*/

ValueInternalMap::ValueInternalMap()
   : buckets_( 0 )
   , tailLink_( 0 )
   , bucketsSize_( 0 )
   , itemCount_( 0 )
{
}


ValueInternalMap::ValueInternalMap( const ValueInternalMap &other )
   : buckets_( 0 )
   , tailLink_( 0 )
   , bucketsSize_( 0 )
   , itemCount_( 0 )
{
   reserve( other.itemCount_ );
   IteratorState it;
   IteratorState itEnd;
   other.makeBeginIterator( it );
   other.makeEndIterator( itEnd );
   for ( ; !equals(it,itEnd); increment(it) )
   {
      bool isStatic;
      const char *memberName = key( it, isStatic );
      const Value &aValue = value( it );
      resolveReference(memberName, isStatic) = aValue;
   }
}


ValueInternalMap &
ValueInternalMap::operator =( const ValueInternalMap &other )
{
   ValueInternalMap dummy( other );
   swap( dummy );
   return *this;
}


ValueInternalMap::~ValueInternalMap()
{
   if ( buckets_ )
   {
      for ( BucketIndex bucketIndex = 0; bucketIndex < bucketsSize_; ++bucketIndex )
      {
         ValueInternalLink *link = buckets_[bucketIndex].next_;
         while ( link )
         {
            ValueInternalLink *linkToRelease = link;
            link = link->next_;
            mapAllocator()->releaseMapLink( linkToRelease );
         }
      }
      mapAllocator()->releaseMapBuckets( buckets_ );
   }
}


void 
ValueInternalMap::swap( ValueInternalMap &other )
{
   ValueInternalLink *tempBuckets = buckets_;
   buckets_ = other.buckets_;
   other.buckets_ = tempBuckets;
   ValueInternalLink *tempTailLink = tailLink_;
   tailLink_ = other.tailLink_;
   other.tailLink_ = tempTailLink;
   BucketIndex tempBucketsSize = bucketsSize_;
   bucketsSize_ = other.bucketsSize_;
   other.bucketsSize_ = tempBucketsSize;
   BucketIndex tempItemCount = itemCount_;
   itemCount_ = other.itemCount_;
   other.itemCount_ = tempItemCount;
}


void 
ValueInternalMap::clear()
{
   ValueInternalMap dummy;
   swap( dummy );
}


ValueInternalMap::BucketIndex 
ValueInternalMap::size() const
{
   return itemCount_;
}

bool 
ValueInternalMap::reserveDelta( BucketIndex growth )
{
   return reserve( itemCount_ + growth );
}

bool 
ValueInternalMap::reserve( BucketIndex newItemCount )
{
   if ( !buckets_  &&  newItemCount > 0 )
   {
      buckets_ = mapAllocator()->allocateMapBuckets( 1 );
      bucketsSize_ = 1;
      tailLink_ = &buckets_[0];
   }
//   BucketIndex idealBucketCount = (newItemCount + ValueInternalLink::itemPerLink) / ValueInternalLink::itemPerLink;
   return true;
}

const Value *
ValueInternalMap::find( const char *key ) const
{
   if ( !bucketsSize_ )
      return 0;
   HashKey hashedKey = hash( key );
   BucketIndex bucketIndex = hashedKey % bucketsSize_;
   for ( const ValueInternalLink *current = &buckets_[bucketIndex]; 
         current != 0; 
         current = current->next_ )
   {
      for ( BucketIndex index = 0; index < ValueInternalLink::itemPerLink; ++index )
      {
         if ( current->items_[index].isItemAvailable() )
            return 0;
         if ( strcmp( key, current->keys_[index] ) == 0 )
            return &current->items_[index];
      }
   }
   return 0;
}


Value *
ValueInternalMap::find( const char *key )
{
   const ValueInternalMap *constThis = this;
   return const_cast<Value *>( constThis->find( key ) );
}


Value &
ValueInternalMap::resolveReference( const char *key, 
                                    bool isStatic )
{
   HashKey hashedKey = hash( key );
   if ( bucketsSize_ )
   {
      BucketIndex bucketIndex = hashedKey % bucketsSize_;
      ValueInternalLink **previous = 0;
      BucketIndex index;
      for ( ValueInternalLink *current = &buckets_[bucketIndex]; 
            current != 0; 
            previous = &current->next_, current = current->next_ )
      {
         for ( index = 0; index < ValueInternalLink::itemPerLink; ++index )
         {
            if ( current->items_[index].isItemAvailable() )
               return setNewItem( key, isStatic, current, index );
            if ( strcmp( key, current->keys_[index] ) == 0 )
               return current->items_[index];
         }
      }
   }

   reserveDelta( 1 );
   return unsafeAdd( key, isStatic, hashedKey );
}

void 
ValueInternalMap::remove( const char *key )
{
   HashKey hashedKey = hash( key );
   if ( !bucketsSize_ )
      return;
   BucketIndex bucketIndex = hashedKey % bucketsSize_;
   for ( ValueInternalLink *link = &buckets_[bucketIndex]; 
         link != 0; 
         link = link->next_ )
   {
      BucketIndex index;
      for ( index = 0; index < ValueInternalLink::itemPerLink; ++index )
      {
         if ( link->items_[index].isItemAvailable() )
            return;
         if ( strcmp( key, link->keys_[index] ) == 0 )
         {
            doActualRemove( link, index, bucketIndex );
            return;
         }
      }
   }
}

void 
ValueInternalMap::doActualRemove( ValueInternalLink *link, 
                                  BucketIndex index,
                                  BucketIndex bucketIndex )
{
   // find the last item of the bucket and swap it with the 'removed' one.
   // set the removed item's flags to 'available'.
   // if the last page only contains 'available' items, deallocate it (it's empty).
   ValueInternalLink *&lastLink = getLastLinkInBucket( index );
   BucketIndex lastItemIndex = 1; // a link can never be empty, so start at 1
   for ( ;
         lastItemIndex < ValueInternalLink::itemPerLink; 
         ++lastItemIndex ) // may be optimized with a binary search
   {
      if ( lastLink->items_[lastItemIndex].isItemAvailable() )
         break;
   }

   BucketIndex lastUsedIndex = lastItemIndex - 1;
   Value *valueToDelete = &link->items_[index];
   Value *valueToPreserve = &lastLink->items_[lastUsedIndex];
   if ( valueToDelete != valueToPreserve )
      valueToDelete->swap( *valueToPreserve );
   if ( lastUsedIndex == 0 )  // page is now empty
   {  // remove it from the bucket linked list and delete it.
      ValueInternalLink *linkPreviousToLast = lastLink->previous_;
      if ( linkPreviousToLast != 0 )   // cannot delete the bucket link itself.
      {
         mapAllocator()->releaseMapLink( lastLink );
         linkPreviousToLast->next_ = 0;
         lastLink = linkPreviousToLast;
      }
   }
   else
   {
      Value dummy;
      valueToPreserve->swap( dummy ); // restore deleted item to a default Value.
      valueToPreserve->setItemUsed( false );
   }
   --itemCount_;
}

ValueInternalLink *&
ValueInternalMap::getLastLinkInBucket( BucketIndex bucketIndex )
{
   if ( bucketIndex == bucketsSize_ - 1 )
      return tailLink_;
   ValueInternalLink *&previous = buckets_[bucketIndex+1].previous_;
   if ( !previous )
      previous = &buckets_[bucketIndex];
   return previous;
}


Value &
ValueInternalMap::setNewItem( const char *key, 
                              bool isStatic,
                              ValueInternalLink *link, 
                              BucketIndex index )
{
   char *duplicatedKey = valueAllocator()->makeMemberName( key );
   ++itemCount_;
   link->keys_[index] = duplicatedKey;
   link->items_[index].setItemUsed();
   link->items_[index].setMemberNameIsStatic( isStatic );
   return link->items_[index]; // items already default constructed.
}


Value &
ValueInternalMap::unsafeAdd( const char *key, 
                             bool isStatic, 
                             HashKey hashedKey )
{
   JSON_ASSERT_MESSAGE( bucketsSize_ > 0, "ValueInternalMap::unsafeAdd(): internal logic error." );
   BucketIndex bucketIndex = hashedKey % bucketsSize_;
   ValueInternalLink *&previousLink = getLastLinkInBucket( bucketIndex );
   ValueInternalLink *link = previousLink;
   BucketIndex index;
   for ( index = 0; index < ValueInternalLink::itemPerLink; ++index )
   {
      if ( link->items_[index].isItemAvailable() )
         break;
   }
   if ( index == ValueInternalLink::itemPerLink ) // need to add a new page
   {
      ValueInternalLink *newLink = mapAllocator()->allocateMapLink();
      index = 0;
      link->next_ = newLink;
      previousLink = newLink;
      link = newLink;
   }
   return setNewItem( key, isStatic, link, index );
}


ValueInternalMap::HashKey 
ValueInternalMap::hash( const char *key ) const
{
   HashKey hash = 0;
   while ( *key )
      hash += *key++ * 37;
   return hash;
}

int 
ValueInternalMap::compare( const ValueInternalMap &other ) const
{
   int sizeDiff( itemCount_ - other.itemCount_ );
   if ( sizeDiff != 0 )
      return sizeDiff;
   // A strict order guarantee is required. Compare all keys FIRST, then compare values.
   IteratorState it;
   IteratorState itEnd;
   makeBeginIterator( it );
   makeEndIterator( itEnd );
   for ( ; !equals(it,itEnd); increment(it) )
   {
      if ( !other.find( key( it ) ) )
         return 1;
   }

   // All keys are equal, let's compare values
   makeBeginIterator( it );
   for ( ; !equals(it,itEnd); increment(it) )
   {
      const Value *otherValue = other.find( key( it ) );
      int valueDiff = value(it).compare( *otherValue );
      if ( valueDiff != 0 )
         return valueDiff;
   }
   return 0;
}

void 
ValueInternalMap::makeBeginIterator( IteratorState &it ) const
{
   it.map_ = const_cast<ValueInternalMap *>( this );
   it.bucketIndex_ = 0;
   it.itemIndex_ = 0;
   it.link_ = buckets_;
}


void 
ValueInternalMap::makeEndIterator( IteratorState &it ) const
{
   it.map_ = const_cast<ValueInternalMap *>( this );
   it.bucketIndex_ = bucketsSize_;
   it.itemIndex_ = 0;
   it.link_ = 0;
}


bool 
ValueInternalMap::equals( const IteratorState &x, const IteratorState &other )
{
   return x.map_ == other.map_
          &&  x.bucketIndex_ == other.bucketIndex_
          &&  x.link_ == other.link_
          &&  x.itemIndex_ == other.itemIndex_;
}

void 
ValueInternalMap::incrementBucket( IteratorState &iterator )
{
   ++iterator.bucketIndex_;
   JSON_ASSERT_MESSAGE( iterator.bucketIndex_ <= iterator.map_->bucketsSize_,
      "ValueInternalMap::increment(): attempting to iterate beyond end." );
   if ( iterator.bucketIndex_ == iterator.map_->bucketsSize_ )
      iterator.link_ = 0;
   else
      iterator.link_ = &(iterator.map_->buckets_[iterator.bucketIndex_]);
   iterator.itemIndex_ = 0;
}


void 
ValueInternalMap::increment( IteratorState &iterator )
{
   JSON_ASSERT_MESSAGE( iterator.map_, "Attempting to iterate using invalid iterator." );
   ++iterator.itemIndex_;
   if ( iterator.itemIndex_ == ValueInternalLink::itemPerLink )
   {
      JSON_ASSERT_MESSAGE( iterator.link_ != 0,
         "ValueInternalMap::increment(): attempting to iterate beyond end." );
      iterator.link_ = iterator.link_->next_;
      if ( iterator.link_ == 0 )
         incrementBucket( iterator );
   }
   else if ( iterator.link_->items_[iterator.itemIndex_].isItemAvailable() )
   {
      incrementBucket( iterator );
   }
}

void 
ValueInternalMap::decrement( IteratorState &iterator )
{
   if ( iterator.itemIndex_ == 0 )
   {
      JSON_ASSERT_MESSAGE( iterator.map_, "Attempting to iterate using invalid iterator." );
      if ( iterator.link_ == &iterator.map_->buckets_[iterator.bucketIndex_] )
      {
         JSON_ASSERT_MESSAGE( iterator.bucketIndex_ > 0, "Attempting to iterate beyond beginning." );
         --(iterator.bucketIndex_);
      }
      iterator.link_ = iterator.link_->previous_;
      iterator.itemIndex_ = ValueInternalLink::itemPerLink - 1;
   }
}


const char *
ValueInternalMap::key( const IteratorState &iterator )
{
   JSON_ASSERT_MESSAGE( iterator.link_, "Attempting to iterate using invalid iterator." );
   return iterator.link_->keys_[iterator.itemIndex_];
}

const char *
ValueInternalMap::key( const IteratorState &iterator, bool &isStatic )
{
   JSON_ASSERT_MESSAGE( iterator.link_, "Attempting to iterate using invalid iterator." );
   isStatic = iterator.link_->items_[iterator.itemIndex_].isMemberNameStatic();
   return iterator.link_->keys_[iterator.itemIndex_];
}


Value &
ValueInternalMap::value( const IteratorState &iterator )
{
   JSON_ASSERT_MESSAGE( iterator.link_, "Attempting to iterate using invalid iterator." );
   return iterator.link_->items_[iterator.itemIndex_];
}

int
|
||||||
|
ValueInternalMap::distance( const IteratorState &x, const IteratorState &y )
|
||||||
|
{
|
||||||
|
int offset = 0;
|
||||||
|
IteratorState it = x;
|
||||||
|
while ( !equals( it, y ) )
|
||||||
|
increment( it );
|
||||||
|
return offset;
|
||||||
|
}
|
885	PowerEditor/src/jsoncpp/src/lib_json/json_reader.cpp	Normal file
@@ -0,0 +1,885 @@
#include <json/reader.h>
#include <json/value.h>
#include <utility>
#include <cstdio>
#include <cassert>
#include <cstring>
#include <iostream>
#include <stdexcept>

#if _MSC_VER >= 1400 // VC++ 8.0
#pragma warning( disable : 4996 )   // disable warning about strdup being deprecated.
#endif

namespace Json {

// Implementation of class Features
// ////////////////////////////////

Features::Features()
   : allowComments_( true )
   , strictRoot_( false )
{
}


Features 
Features::all()
{
   return Features();
}


Features 
Features::strictMode()
{
   Features features;
   features.allowComments_ = false;
   features.strictRoot_ = true;
   return features;
}

// Implementation of class Reader
// ////////////////////////////////


static inline bool 
in( Reader::Char c, Reader::Char c1, Reader::Char c2, Reader::Char c3, Reader::Char c4 )
{
   return c == c1 || c == c2 || c == c3 || c == c4;
}

static inline bool 
in( Reader::Char c, Reader::Char c1, Reader::Char c2, Reader::Char c3, Reader::Char c4, Reader::Char c5 )
{
   return c == c1 || c == c2 || c == c3 || c == c4 || c == c5;
}


static bool 
containsNewLine( Reader::Location begin, 
                 Reader::Location end )
{
   for ( ; begin < end; ++begin )
      if ( *begin == '\n' || *begin == '\r' )
         return true;
   return false;
}

static std::string codePointToUTF8(unsigned int cp)
{
   std::string result;

   // based on description from http://en.wikipedia.org/wiki/UTF-8

   if (cp <= 0x7f)
   {
      result.resize(1);
      result[0] = static_cast<char>(cp);
   }
   else if (cp <= 0x7FF)
   {
      result.resize(2);
      result[1] = static_cast<char>(0x80 | (0x3f & cp));
      result[0] = static_cast<char>(0xC0 | (0x1f & (cp >> 6)));
   }
   else if (cp <= 0xFFFF)
   {
      result.resize(3);
      result[2] = static_cast<char>(0x80 | (0x3f & cp));
      result[1] = 0x80 | static_cast<char>((0x3f & (cp >> 6)));
      result[0] = 0xE0 | static_cast<char>((0xf & (cp >> 12)));
   }
   else if (cp <= 0x10FFFF)
   {
      result.resize(4);
      result[3] = static_cast<char>(0x80 | (0x3f & cp));
      result[2] = static_cast<char>(0x80 | (0x3f & (cp >> 6)));
      result[1] = static_cast<char>(0x80 | (0x3f & (cp >> 12)));
      result[0] = static_cast<char>(0xF0 | (0x7 & (cp >> 18)));
   }

   return result;
}
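The encoder above selects a 1-, 2-, 3- or 4-byte sequence by code-point range and fills the bytes from the low bits upward. The following standalone sketch (a hypothetical copy for illustration, not part of this commit) reproduces the same bit arithmetic so the branch boundaries can be checked against well-known UTF-8 encodings:

```cpp
#include <cassert>
#include <string>

// Standalone copy of the UTF-8 encoder above, for illustration only.
std::string codePointToUTF8(unsigned int cp)
{
    std::string result;
    if (cp <= 0x7f)                      // 1 byte: ASCII passes through
    {
        result.resize(1);
        result[0] = static_cast<char>(cp);
    }
    else if (cp <= 0x7FF)                // 2 bytes: 110xxxxx 10xxxxxx
    {
        result.resize(2);
        result[1] = static_cast<char>(0x80 | (0x3f & cp));
        result[0] = static_cast<char>(0xC0 | (0x1f & (cp >> 6)));
    }
    else if (cp <= 0xFFFF)               // 3 bytes: 1110xxxx 10xxxxxx 10xxxxxx
    {
        result.resize(3);
        result[2] = static_cast<char>(0x80 | (0x3f & cp));
        result[1] = static_cast<char>(0x80 | (0x3f & (cp >> 6)));
        result[0] = static_cast<char>(0xE0 | (0xf & (cp >> 12)));
    }
    else if (cp <= 0x10FFFF)             // 4 bytes: 11110xxx 10xxxxxx 10xxxxxx 10xxxxxx
    {
        result.resize(4);
        result[3] = static_cast<char>(0x80 | (0x3f & cp));
        result[2] = static_cast<char>(0x80 | (0x3f & (cp >> 6)));
        result[1] = static_cast<char>(0x80 | (0x3f & (cp >> 12)));
        result[0] = static_cast<char>(0xF0 | (0x7 & (cp >> 18)));
    }
    return result;                       // empty for cp > 0x10FFFF, as in the library
}
```

Note that code points above U+10FFFF fall through every branch and yield an empty string, matching the library's behavior.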
// Class Reader
// //////////////////////////////////////////////////////////////////

Reader::Reader()
   : features_( Features::all() )
{
}


Reader::Reader( const Features &features )
   : features_( features )
{
}


bool
Reader::parse( const std::string &document, 
               Value &root,
               bool collectComments )
{
   document_ = document;
   const char *begin = document_.c_str();
   const char *end = begin + document_.length();
   return parse( begin, end, root, collectComments );
}


bool
Reader::parse( std::istream& sin,
               Value &root,
               bool collectComments )
{
   //std::istream_iterator<char> begin(sin);
   //std::istream_iterator<char> end;
   // Those would allow streamed input from a file, if parse() were a
   // template function.

   // Since std::string is reference-counted, this at least does not
   // create an extra copy.
   std::string doc;
   std::getline(sin, doc, (char)EOF);
   return parse( doc, root, collectComments );
}


bool 
Reader::parse( const char *beginDoc, const char *endDoc, 
               Value &root,
               bool collectComments )
{
   if ( !features_.allowComments_ )
   {
      collectComments = false;
   }

   begin_ = beginDoc;
   end_ = endDoc;
   collectComments_ = collectComments;
   current_ = begin_;
   lastValueEnd_ = 0;
   lastValue_ = 0;
   commentsBefore_ = "";
   errors_.clear();
   while ( !nodes_.empty() )
      nodes_.pop();
   nodes_.push( &root );

   bool successful = readValue();
   Token token;
   skipCommentTokens( token );
   if ( collectComments_ && !commentsBefore_.empty() )
      root.setComment( commentsBefore_, commentAfter );
   if ( features_.strictRoot_ )
   {
      if ( !root.isArray() && !root.isObject() )
      {
         // Set error location to start of doc, ideally should be first token found in doc
         token.type_ = tokenError;
         token.start_ = beginDoc;
         token.end_ = endDoc;
         addError( "A valid JSON document must be either an array or an object value.",
                   token );
         return false;
      }
   }
   return successful;
}
bool
Reader::readValue()
{
   Token token;
   skipCommentTokens( token );
   bool successful = true;

   if ( collectComments_ && !commentsBefore_.empty() )
   {
      currentValue().setComment( commentsBefore_, commentBefore );
      commentsBefore_ = "";
   }


   switch ( token.type_ )
   {
   case tokenObjectBegin:
      successful = readObject( token );
      break;
   case tokenArrayBegin:
      successful = readArray( token );
      break;
   case tokenNumber:
      successful = decodeNumber( token );
      break;
   case tokenString:
      successful = decodeString( token );
      break;
   case tokenTrue:
      currentValue() = true;
      break;
   case tokenFalse:
      currentValue() = false;
      break;
   case tokenNull:
      currentValue() = Value();
      break;
   default:
      return addError( "Syntax error: value, object or array expected.", token );
   }

   if ( collectComments_ )
   {
      lastValueEnd_ = current_;
      lastValue_ = &currentValue();
   }

   return successful;
}


void 
Reader::skipCommentTokens( Token &token )
{
   if ( features_.allowComments_ )
   {
      do
      {
         readToken( token );
      }
      while ( token.type_ == tokenComment );
   }
   else
   {
      readToken( token );
   }
}
bool 
Reader::expectToken( TokenType type, Token &token, const char *message )
{
   readToken( token );
   if ( token.type_ != type )
      return addError( message, token );
   return true;
}


bool 
Reader::readToken( Token &token )
{
   skipSpaces();
   token.start_ = current_;
   Char c = getNextChar();
   bool ok = true;
   switch ( c )
   {
   case '{':
      token.type_ = tokenObjectBegin;
      break;
   case '}':
      token.type_ = tokenObjectEnd;
      break;
   case '[':
      token.type_ = tokenArrayBegin;
      break;
   case ']':
      token.type_ = tokenArrayEnd;
      break;
   case '"':
      token.type_ = tokenString;
      ok = readString();
      break;
   case '/':
      token.type_ = tokenComment;
      ok = readComment();
      break;
   case '0':
   case '1':
   case '2':
   case '3':
   case '4':
   case '5':
   case '6':
   case '7':
   case '8':
   case '9':
   case '-':
      token.type_ = tokenNumber;
      readNumber();
      break;
   case 't':
      token.type_ = tokenTrue;
      ok = match( "rue", 3 );
      break;
   case 'f':
      token.type_ = tokenFalse;
      ok = match( "alse", 4 );
      break;
   case 'n':
      token.type_ = tokenNull;
      ok = match( "ull", 3 );
      break;
   case ',':
      token.type_ = tokenArraySeparator;
      break;
   case ':':
      token.type_ = tokenMemberSeparator;
      break;
   case 0:
      token.type_ = tokenEndOfStream;
      break;
   default:
      ok = false;
      break;
   }
   if ( !ok )
      token.type_ = tokenError;
   token.end_ = current_;
   return true;
}
void 
Reader::skipSpaces()
{
   while ( current_ != end_ )
   {
      Char c = *current_;
      if ( c == ' ' || c == '\t' || c == '\r' || c == '\n' )
         ++current_;
      else
         break;
   }
}


bool 
Reader::match( Location pattern, 
               int patternLength )
{
   if ( end_ - current_ < patternLength )
      return false;
   int index = patternLength;
   while ( index-- )
      if ( current_[index] != pattern[index] )
         return false;
   current_ += patternLength;
   return true;
}
bool 
Reader::readComment()
{
   Location commentBegin = current_ - 1;
   Char c = getNextChar();
   bool successful = false;
   if ( c == '*' )
      successful = readCStyleComment();
   else if ( c == '/' )
      successful = readCppStyleComment();
   if ( !successful )
      return false;

   if ( collectComments_ )
   {
      CommentPlacement placement = commentBefore;
      if ( lastValueEnd_ && !containsNewLine( lastValueEnd_, commentBegin ) )
      {
         if ( c != '*' || !containsNewLine( commentBegin, current_ ) )
            placement = commentAfterOnSameLine;
      }

      addComment( commentBegin, current_, placement );
   }
   return true;
}


void 
Reader::addComment( Location begin, 
                    Location end, 
                    CommentPlacement placement )
{
   assert( collectComments_ );
   if ( placement == commentAfterOnSameLine )
   {
      assert( lastValue_ != 0 );
      lastValue_->setComment( std::string( begin, end ), placement );
   }
   else
   {
      if ( !commentsBefore_.empty() )
         commentsBefore_ += "\n";
      commentsBefore_ += std::string( begin, end );
   }
}


bool 
Reader::readCStyleComment()
{
   while ( current_ != end_ )
   {
      Char c = getNextChar();
      if ( c == '*' && *current_ == '/' )
         break;
   }
   return getNextChar() == '/';
}


bool 
Reader::readCppStyleComment()
{
   while ( current_ != end_ )
   {
      Char c = getNextChar();
      if ( c == '\r' || c == '\n' )
         break;
   }
   return true;
}
void 
Reader::readNumber()
{
   while ( current_ != end_ )
   {
      if ( !(*current_ >= '0' && *current_ <= '9') &&
           !in( *current_, '.', 'e', 'E', '+', '-' ) )
         break;
      ++current_;
   }
}

bool
Reader::readString()
{
   Char c = 0;
   while ( current_ != end_ )
   {
      c = getNextChar();
      if ( c == '\\' )
         getNextChar();
      else if ( c == '"' )
         break;
   }
   return c == '"';
}
bool 
Reader::readObject( Token &tokenStart )
{
   Token tokenName;
   std::string name;
   currentValue() = Value( objectValue );
   while ( readToken( tokenName ) )
   {
      bool initialTokenOk = true;
      while ( tokenName.type_ == tokenComment && initialTokenOk )
         initialTokenOk = readToken( tokenName );
      if ( !initialTokenOk )
         break;
      if ( tokenName.type_ == tokenObjectEnd && name.empty() )  // empty object
         return true;
      if ( tokenName.type_ != tokenString )
         break;

      name = "";
      if ( !decodeString( tokenName, name ) )
         return recoverFromError( tokenObjectEnd );

      Token colon;
      if ( !readToken( colon ) || colon.type_ != tokenMemberSeparator )
      {
         return addErrorAndRecover( "Missing ':' after object member name", 
                                    colon, 
                                    tokenObjectEnd );
      }
      Value &value = currentValue()[ name ];
      nodes_.push( &value );
      bool ok = readValue();
      nodes_.pop();
      if ( !ok ) // error already set
         return recoverFromError( tokenObjectEnd );

      Token comma;
      if ( !readToken( comma )
            ||  ( comma.type_ != tokenObjectEnd && 
                  comma.type_ != tokenArraySeparator &&
                  comma.type_ != tokenComment ) )
      {
         return addErrorAndRecover( "Missing ',' or '}' in object declaration", 
                                    comma, 
                                    tokenObjectEnd );
      }
      bool finalizeTokenOk = true;
      while ( comma.type_ == tokenComment &&
              finalizeTokenOk )
         finalizeTokenOk = readToken( comma );
      if ( comma.type_ == tokenObjectEnd )
         return true;
   }
   return addErrorAndRecover( "Missing '}' or object member name", 
                              tokenName, 
                              tokenObjectEnd );
}
bool 
Reader::readArray( Token &tokenStart )
{
   currentValue() = Value( arrayValue );
   skipSpaces();
   if ( *current_ == ']' ) // empty array
   {
      Token endArray;
      readToken( endArray );
      return true;
   }
   int index = 0;
   while ( true )
   {
      Value &value = currentValue()[ index++ ];
      nodes_.push( &value );
      bool ok = readValue();
      nodes_.pop();
      if ( !ok ) // error already set
         return recoverFromError( tokenArrayEnd );

      Token token;
      // Accept Comment after last item in the array.
      ok = readToken( token );
      while ( token.type_ == tokenComment && ok )
      {
         ok = readToken( token );
      }
      bool badTokenType = ( token.type_ != tokenArraySeparator &&
                            token.type_ != tokenArrayEnd );
      if ( !ok || badTokenType )
      {
         return addErrorAndRecover( "Missing ',' or ']' in array declaration", 
                                    token, 
                                    tokenArrayEnd );
      }
      if ( token.type_ == tokenArrayEnd )
         break;
   }
   return true;
}
bool 
Reader::decodeNumber( Token &token )
{
   bool isDouble = false;
   for ( Location inspect = token.start_; inspect != token.end_; ++inspect )
   {
      isDouble = isDouble  
                 ||  in( *inspect, '.', 'e', 'E', '+' )  
                 ||  ( *inspect == '-' && inspect != token.start_ );
   }
   if ( isDouble )
      return decodeDouble( token );
   Location current = token.start_;
   bool isNegative = *current == '-';
   if ( isNegative )
      ++current;
   Value::UInt threshold = (isNegative ? Value::UInt(-Value::minInt) 
                                       : Value::maxUInt) / 10;
   Value::UInt value = 0;
   while ( current < token.end_ )
   {
      Char c = *current++;
      if ( c < '0' || c > '9' )
         return addError( "'" + std::string( token.start_, token.end_ ) + "' is not a number.", token );
      if ( value >= threshold )
         return decodeDouble( token );
      value = value * 10 + Value::UInt(c - '0');
   }
   if ( isNegative )
      currentValue() = -Value::Int( value );
   else if ( value <= Value::UInt(Value::maxInt) )
      currentValue() = Value::Int( value );
   else
      currentValue() = value;
   return true;
}
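decodeNumber() avoids integer overflow by comparing the running value against a precomputed threshold (the maximum divided by 10) before folding in each new digit; once the value reaches the threshold, one more digit could overflow, so it falls back to double parsing. A minimal standalone sketch of that guard (hypothetical names, not part of this commit):

```cpp
#include <cassert>
#include <string>

// Sketch of the overflow guard used by decodeNumber(): accumulate decimal
// digits while value stays below threshold = maxValue / 10. Returns false on
// a non-digit or when one more digit might overflow (the caller would then
// fall back to double parsing, as the library does).
bool accumulateDecimal(const std::string &digits, unsigned int maxValue, unsigned int &out)
{
    const unsigned int threshold = maxValue / 10;
    unsigned int value = 0;
    for (std::string::size_type i = 0; i < digits.size(); ++i)
    {
        char c = digits[i];
        if (c < '0' || c > '9')
            return false;              // not a digit: reject
        if (value >= threshold)
            return false;              // next multiply-by-10 may overflow
        value = value * 10 + static_cast<unsigned int>(c - '0');
    }
    out = value;
    return true;
}
```

The guard is deliberately conservative: some values that would fit exactly (such as the maximum itself) still trigger the fallback, which is harmless since the double path parses them correctly.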
bool 
Reader::decodeDouble( Token &token )
{
   double value = 0;
   const int bufferSize = 32;
   int count;
   int length = int(token.end_ - token.start_);
   if ( length <= bufferSize )
   {
      Char buffer[bufferSize+1];   // +1 for the terminating zero when length == bufferSize
      memcpy( buffer, token.start_, length );
      buffer[length] = 0;
      count = sscanf( buffer, "%lf", &value );
   }
   else
   {
      std::string buffer( token.start_, token.end_ );
      count = sscanf( buffer.c_str(), "%lf", &value );
   }

   if ( count != 1 )
      return addError( "'" + std::string( token.start_, token.end_ ) + "' is not a number.", token );
   currentValue() = value;
   return true;
}
bool 
Reader::decodeString( Token &token )
{
   std::string decoded;
   if ( !decodeString( token, decoded ) )
      return false;
   currentValue() = decoded;
   return true;
}


bool 
Reader::decodeString( Token &token, std::string &decoded )
{
   decoded.reserve( token.end_ - token.start_ - 2 );
   Location current = token.start_ + 1; // skip '"'
   Location end = token.end_ - 1;       // do not include '"'
   while ( current != end )
   {
      Char c = *current++;
      if ( c == '"' )
         break;
      else if ( c == '\\' )
      {
         if ( current == end )
            return addError( "Empty escape sequence in string", token, current );
         Char escape = *current++;
         switch ( escape )
         {
         case '"': decoded += '"'; break;
         case '/': decoded += '/'; break;
         case '\\': decoded += '\\'; break;
         case 'b': decoded += '\b'; break;
         case 'f': decoded += '\f'; break;
         case 'n': decoded += '\n'; break;
         case 'r': decoded += '\r'; break;
         case 't': decoded += '\t'; break;
         case 'u':
            {
               unsigned int unicode;
               if ( !decodeUnicodeCodePoint( token, current, end, unicode ) )
                  return false;
               decoded += codePointToUTF8(unicode);
            }
            break;
         default:
            return addError( "Bad escape sequence in string", token, current );
         }
      }
      else
      {
         decoded += c;
      }
   }
   return true;
}
bool
Reader::decodeUnicodeCodePoint( Token &token, 
                                Location &current, 
                                Location end, 
                                unsigned int &unicode )
{

   if ( !decodeUnicodeEscapeSequence( token, current, end, unicode ) )
      return false;
   if (unicode >= 0xD800 && unicode <= 0xDBFF)
   {
      // surrogate pairs
      if (end - current < 6)
         return addError( "additional six characters expected to parse unicode surrogate pair.", token, current );
      unsigned int surrogatePair;
      if (*(current++) == '\\' && *(current++) == 'u')
      {
         if (decodeUnicodeEscapeSequence( token, current, end, surrogatePair ))
         {
            unicode = 0x10000 + ((unicode & 0x3FF) << 10) + (surrogatePair & 0x3FF);
         } 
         else
            return false;
      } 
      else
         return addError( "expecting another \\u token to begin the second half of a unicode surrogate pair", token, current );
   }
   return true;
}
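The surrogate-pair arithmetic above can be isolated for inspection: the high surrogate (0xD800..0xDBFF) carries the top 10 bits and the low surrogate (0xDC00..0xDFFF) the bottom 10 bits of the code point's offset from 0x10000. A hypothetical standalone version (not part of this commit):

```cpp
#include <cassert>

// Combine a UTF-16 surrogate pair into a Unicode code point, using the same
// expression as decodeUnicodeCodePoint() above. No validation is performed
// here; callers are assumed to pass a valid high/low surrogate pair.
unsigned int combineSurrogates(unsigned int high, unsigned int low)
{
    return 0x10000 + ((high & 0x3FF) << 10) + (low & 0x3FF);
}
```

For example, the pair \uD83D\uDE00 combines to U+1F600.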
bool 
Reader::decodeUnicodeEscapeSequence( Token &token, 
                                     Location &current, 
                                     Location end, 
                                     unsigned int &unicode )
{
   if ( end - current < 4 )
      return addError( "Bad unicode escape sequence in string: four digits expected.", token, current );
   unicode = 0;
   for ( int index = 0; index < 4; ++index )
   {
      Char c = *current++;
      unicode *= 16;
      if ( c >= '0' && c <= '9' )
         unicode += c - '0';
      else if ( c >= 'a' && c <= 'f' )
         unicode += c - 'a' + 10;
      else if ( c >= 'A' && c <= 'F' )
         unicode += c - 'A' + 10;
      else
         return addError( "Bad unicode escape sequence in string: hexadecimal digit expected.", token, current );
   }
   return true;
}
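The four-digit hex loop above accumulates a value in base 16, accepting both lowercase and uppercase digits. A hypothetical standalone version (illustration only; it returns -1 instead of recording a parse error):

```cpp
#include <string>

// Parse exactly four hexadecimal digits, mirroring the loop in
// decodeUnicodeEscapeSequence(). Returns the value, or -1 on bad input.
int parseHexQuad(const std::string &s)
{
    if (s.size() < 4)
        return -1;                      // four digits required
    int value = 0;
    for (int i = 0; i < 4; ++i)
    {
        char c = s[i];
        value *= 16;
        if (c >= '0' && c <= '9')
            value += c - '0';
        else if (c >= 'a' && c <= 'f')
            value += c - 'a' + 10;
        else if (c >= 'A' && c <= 'F')
            value += c - 'A' + 10;
        else
            return -1;                  // not a hex digit
    }
    return value;
}
```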
bool 
Reader::addError( const std::string &message, 
                  Token &token,
                  Location extra )
{
   ErrorInfo info;
   info.token_ = token;
   info.message_ = message;
   info.extra_ = extra;
   errors_.push_back( info );
   return false;
}


bool 
Reader::recoverFromError( TokenType skipUntilToken )
{
   int errorCount = int(errors_.size());
   Token skip;
   while ( true )
   {
      if ( !readToken(skip) )
         errors_.resize( errorCount ); // discard errors caused by recovery
      if ( skip.type_ == skipUntilToken || skip.type_ == tokenEndOfStream )
         break;
   }
   errors_.resize( errorCount );
   return false;
}


bool 
Reader::addErrorAndRecover( const std::string &message, 
                            Token &token,
                            TokenType skipUntilToken )
{
   addError( message, token );
   return recoverFromError( skipUntilToken );
}
Value &
Reader::currentValue()
{
   return *(nodes_.top());
}


Reader::Char 
Reader::getNextChar()
{
   if ( current_ == end_ )
      return 0;
   return *current_++;
}
void 
Reader::getLocationLineAndColumn( Location location,
                                  int &line,
                                  int &column ) const
{
   Location current = begin_;
   Location lastLineStart = current;
   line = 0;
   while ( current < location && current != end_ )
   {
      Char c = *current++;
      if ( c == '\r' )
      {
         if ( *current == '\n' )
            ++current;
         lastLineStart = current;
         ++line;
      }
      else if ( c == '\n' )
      {
         lastLineStart = current;
         ++line;
      }
   }
   // column & line start at 1
   column = int(location - lastLineStart) + 1;
   ++line;
}


std::string
Reader::getLocationLineAndColumn( Location location ) const
{
   int line, column;
   getLocationLineAndColumn( location, line, column );
   char buffer[18+16+16+1];
   sprintf( buffer, "Line %d, Column %d", line, column );
   return buffer;
}
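The scan above turns a byte offset into a 1-based line/column pair, counting "\r", "\n" and "\r\n" each as a single line break. A hypothetical standalone version over std::string (illustration only; it adds a bounds check before peeking past a trailing '\r'):

```cpp
#include <string>
#include <utility>

// Compute the 1-based (line, column) of a byte offset, mirroring the loop in
// Reader::getLocationLineAndColumn(). "\r\n" counts as one break.
std::pair<int, int> lineAndColumn(const std::string &text, std::string::size_type offset)
{
    std::string::size_type current = 0, lastLineStart = 0;
    int line = 0;
    while (current < offset && current < text.size())
    {
        char c = text[current++];
        if (c == '\r')
        {
            if (current < text.size() && text[current] == '\n')
                ++current;              // swallow the '\n' of a "\r\n" pair
            lastLineStart = current;
            ++line;
        }
        else if (c == '\n')
        {
            lastLineStart = current;
            ++line;
        }
    }
    return std::make_pair(line + 1, int(offset - lastLineStart) + 1);
}
```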
std::string 
Reader::getFormatedErrorMessages() const
{
   std::string formattedMessage;
   for ( Errors::const_iterator itError = errors_.begin();
         itError != errors_.end();
         ++itError )
   {
      const ErrorInfo &error = *itError;
      formattedMessage += "* " + getLocationLineAndColumn( error.token_.start_ ) + "\n";
      formattedMessage += "  " + error.message_ + "\n";
      if ( error.extra_ )
         formattedMessage += "See " + getLocationLineAndColumn( error.extra_ ) + " for detail.\n";
   }
   return formattedMessage;
}


std::istream& operator>>( std::istream &sin, Value &root )
{
   Json::Reader reader;
   bool ok = reader.parse(sin, root, true);
   //JSON_ASSERT( ok );
   if (!ok) throw std::runtime_error(reader.getFormatedErrorMessages());
   return sin;
}


} // namespace Json
1718	PowerEditor/src/jsoncpp/src/lib_json/json_value.cpp	Normal file
File diff suppressed because it is too large. Load Diff
292	PowerEditor/src/jsoncpp/src/lib_json/json_valueiterator.inl	Normal file
@@ -0,0 +1,292 @@
// included by json_value.cpp
// everything is within Json namespace


// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// class ValueIteratorBase
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////

ValueIteratorBase::ValueIteratorBase()
#ifndef JSON_VALUE_USE_INTERNAL_MAP
   : current_()
   , isNull_( true )
{
}
#else
   : isArray_( true )
   , isNull_( true )
{
   iterator_.array_ = ValueInternalArray::IteratorState();
}
#endif


#ifndef JSON_VALUE_USE_INTERNAL_MAP
ValueIteratorBase::ValueIteratorBase( const Value::ObjectValues::iterator &current )
   : current_( current )
   , isNull_( false )
{
}
#else
ValueIteratorBase::ValueIteratorBase( const ValueInternalArray::IteratorState &state )
   : isArray_( true )
{
   iterator_.array_ = state;
}


ValueIteratorBase::ValueIteratorBase( const ValueInternalMap::IteratorState &state )
   : isArray_( false )
{
   iterator_.map_ = state;
}
#endif
Value &
ValueIteratorBase::deref() const
{
#ifndef JSON_VALUE_USE_INTERNAL_MAP
   return current_->second;
#else
   if ( isArray_ )
      return ValueInternalArray::dereference( iterator_.array_ );
   return ValueInternalMap::value( iterator_.map_ );
#endif
}


void 
ValueIteratorBase::increment()
{
#ifndef JSON_VALUE_USE_INTERNAL_MAP
   ++current_;
#else
   if ( isArray_ )
      ValueInternalArray::increment( iterator_.array_ );
   ValueInternalMap::increment( iterator_.map_ );
#endif
}


void 
ValueIteratorBase::decrement()
{
#ifndef JSON_VALUE_USE_INTERNAL_MAP
   --current_;
#else
   if ( isArray_ )
      ValueInternalArray::decrement( iterator_.array_ );
   ValueInternalMap::decrement( iterator_.map_ );
#endif
}
ValueIteratorBase::difference_type 
ValueIteratorBase::computeDistance( const SelfType &other ) const
{
#ifndef JSON_VALUE_USE_INTERNAL_MAP
# ifdef JSON_USE_CPPTL_SMALLMAP
   return current_ - other.current_;
# else
   // Iterator for null value are initialized using the default
   // constructor, which initialize current_ to the default
   // std::map::iterator. As begin() and end() are two instance 
   // of the default std::map::iterator, they can not be compared.
   // To allow this, we handle this comparison specifically.
   if ( isNull_ && other.isNull_ )
   {
      return 0;
   }


   // Usage of std::distance is not portable (does not compile with Sun Studio 12 RogueWave STL,
   // which is the one used by default).
   // Using a portable hand-made version for non random iterator instead:
   //   return difference_type( std::distance( current_, other.current_ ) );
   difference_type myDistance = 0;
   for ( Value::ObjectValues::iterator it = current_; it != other.current_; ++it )
   {
      ++myDistance;
   }
   return myDistance;
# endif
#else
   if ( isArray_ )
      return ValueInternalArray::distance( iterator_.array_, other.iterator_.array_ );
   return ValueInternalMap::distance( iterator_.map_, other.iterator_.map_ );
#endif
}
|
||||||
|
|
||||||
|
|
||||||
|
bool
|
||||||
|
ValueIteratorBase::isEqual( const SelfType &other ) const
|
||||||
|
{
|
||||||
|
#ifndef JSON_VALUE_USE_INTERNAL_MAP
|
||||||
|
if ( isNull_ )
|
||||||
|
{
|
||||||
|
return other.isNull_;
|
||||||
|
}
|
||||||
|
return current_ == other.current_;
|
||||||
|
#else
|
||||||
|
if ( isArray_ )
|
||||||
|
return ValueInternalArray::equals( iterator_.array_, other.iterator_.array_ );
|
||||||
|
return ValueInternalMap::equals( iterator_.map_, other.iterator_.map_ );
|
||||||
|
#endif
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
void
|
||||||
|
ValueIteratorBase::copy( const SelfType &other )
|
||||||
|
{
|
||||||
|
#ifndef JSON_VALUE_USE_INTERNAL_MAP
|
||||||
|
current_ = other.current_;
|
||||||
|
#else
|
||||||
|
if ( isArray_ )
|
||||||
|
iterator_.array_ = other.iterator_.array_;
|
||||||
|
iterator_.map_ = other.iterator_.map_;
|
||||||
|
#endif
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
Value
|
||||||
|
ValueIteratorBase::key() const
|
||||||
|
{
|
||||||
|
#ifndef JSON_VALUE_USE_INTERNAL_MAP
|
||||||
|
const Value::CZString czstring = (*current_).first;
|
||||||
|
if ( czstring.c_str() )
|
||||||
|
{
|
||||||
|
if ( czstring.isStaticString() )
|
||||||
|
return Value( StaticString( czstring.c_str() ) );
|
||||||
|
return Value( czstring.c_str() );
|
||||||
|
}
|
||||||
|
return Value( czstring.index() );
|
||||||
|
#else
|
||||||
|
if ( isArray_ )
|
||||||
|
return Value( ValueInternalArray::indexOf( iterator_.array_ ) );
|
||||||
|
bool isStatic;
|
||||||
|
const char *memberName = ValueInternalMap::key( iterator_.map_, isStatic );
|
||||||
|
if ( isStatic )
|
||||||
|
return Value( StaticString( memberName ) );
|
||||||
|
return Value( memberName );
|
||||||
|
#endif
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
UInt
|
||||||
|
ValueIteratorBase::index() const
|
||||||
|
{
|
||||||
|
#ifndef JSON_VALUE_USE_INTERNAL_MAP
|
||||||
|
const Value::CZString czstring = (*current_).first;
|
||||||
|
if ( !czstring.c_str() )
|
||||||
|
return czstring.index();
|
||||||
|
return Value::UInt( -1 );
|
||||||
|
#else
|
||||||
|
if ( isArray_ )
|
||||||
|
return Value::UInt( ValueInternalArray::indexOf( iterator_.array_ ) );
|
||||||
|
return Value::UInt( -1 );
|
||||||
|
#endif
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
const char *
|
||||||
|
ValueIteratorBase::memberName() const
|
||||||
|
{
|
||||||
|
#ifndef JSON_VALUE_USE_INTERNAL_MAP
|
||||||
|
const char *name = (*current_).first.c_str();
|
||||||
|
return name ? name : "";
|
||||||
|
#else
|
||||||
|
if ( !isArray_ )
|
||||||
|
return ValueInternalMap::key( iterator_.map_ );
|
||||||
|
return "";
|
||||||
|
#endif
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// class ValueConstIterator
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
|
||||||
|
ValueConstIterator::ValueConstIterator()
|
||||||
|
{
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
#ifndef JSON_VALUE_USE_INTERNAL_MAP
|
||||||
|
ValueConstIterator::ValueConstIterator( const Value::ObjectValues::iterator ¤t )
|
||||||
|
: ValueIteratorBase( current )
|
||||||
|
{
|
||||||
|
}
|
||||||
|
#else
|
||||||
|
ValueConstIterator::ValueConstIterator( const ValueInternalArray::IteratorState &state )
|
||||||
|
: ValueIteratorBase( state )
|
||||||
|
{
|
||||||
|
}
|
||||||
|
|
||||||
|
ValueConstIterator::ValueConstIterator( const ValueInternalMap::IteratorState &state )
|
||||||
|
: ValueIteratorBase( state )
|
||||||
|
{
|
||||||
|
}
|
||||||
|
#endif
|
||||||
|
|
||||||
|
ValueConstIterator &
|
||||||
|
ValueConstIterator::operator =( const ValueIteratorBase &other )
|
||||||
|
{
|
||||||
|
copy( other );
|
||||||
|
return *this;
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// class ValueIterator
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
// //////////////////////////////////////////////////////////////////
|
||||||
|
|
||||||
|
ValueIterator::ValueIterator()
|
||||||
|
{
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
#ifndef JSON_VALUE_USE_INTERNAL_MAP
|
||||||
|
ValueIterator::ValueIterator( const Value::ObjectValues::iterator ¤t )
|
||||||
|
: ValueIteratorBase( current )
|
||||||
|
{
|
||||||
|
}
|
||||||
|
#else
|
||||||
|
ValueIterator::ValueIterator( const ValueInternalArray::IteratorState &state )
|
||||||
|
: ValueIteratorBase( state )
|
||||||
|
{
|
||||||
|
}
|
||||||
|
|
||||||
|
ValueIterator::ValueIterator( const ValueInternalMap::IteratorState &state )
|
||||||
|
: ValueIteratorBase( state )
|
||||||
|
{
|
||||||
|
}
|
||||||
|
#endif
|
||||||
|
|
||||||
|
ValueIterator::ValueIterator( const ValueConstIterator &other )
|
||||||
|
: ValueIteratorBase( other )
|
||||||
|
{
|
||||||
|
}
|
||||||
|
|
||||||
|
ValueIterator::ValueIterator( const ValueIterator &other )
|
||||||
|
: ValueIteratorBase( other )
|
||||||
|
{
|
||||||
|
}
|
||||||
|
|
||||||
|
ValueIterator &
|
||||||
|
ValueIterator::operator =( const SelfType &other )
|
||||||
|
{
|
||||||
|
copy( other );
|
||||||
|
return *this;
|
||||||
|
}
|
829
PowerEditor/src/jsoncpp/src/lib_json/json_writer.cpp
Normal file
@ -0,0 +1,829 @@
#include <json/writer.h>
#include <utility>
#include <assert.h>
#include <stdio.h>
#include <string.h>
#include <iostream>
#include <sstream>
#include <iomanip>

#if _MSC_VER >= 1400 // VC++ 8.0
#pragma warning( disable : 4996 )   // disable warning about strdup being deprecated.
#endif

namespace Json {

static bool isControlCharacter(char ch)
{
   return ch > 0 && ch <= 0x1F;
}

static bool containsControlCharacter( const char* str )
{
   while ( *str )
   {
      if ( isControlCharacter( *(str++) ) )
         return true;
   }
   return false;
}

static void uintToString( unsigned int value,
                          char *&current )
{
   *--current = 0;
   do
   {
      *--current = (value % 10) + '0';
      value /= 10;
   }
   while ( value != 0 );
}

std::string valueToString( Int value )
{
   char buffer[32];
   char *current = buffer + sizeof(buffer);
   bool isNegative = value < 0;
   if ( isNegative )
      value = -value;
   uintToString( UInt(value), current );
   if ( isNegative )
      *--current = '-';
   assert( current >= buffer );
   return current;
}
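As a side note on the helper above: uintToString fills its scratch buffer backwards, so the digits come out in the right order without a reversal pass or any heap allocation. A self-contained sketch of the same trick (the function name is mine, not the library's; unlike the `value = -value` above, it also handles INT_MIN by negating in unsigned arithmetic):

```cpp
#include <string>

// Re-creation of the back-to-front buffer fill used by uintToString /
// valueToString(Int) above: write the NUL terminator first, then emit digits
// from least significant to most significant, decrementing the pointer.
static std::string formatInt(int value)
{
    char buffer[32];
    char *current = buffer + sizeof(buffer);
    *--current = '\0';                      // NUL terminator goes in first
    bool isNegative = value < 0;
    unsigned magnitude = isNegative ? 0u - static_cast<unsigned>(value)
                                    : static_cast<unsigned>(value);
    do                                      // least-significant digit first
    {
        *--current = static_cast<char>('0' + magnitude % 10);
        magnitude /= 10;
    }
    while (magnitude != 0);
    if (isNegative)
        *--current = '-';
    return current;                         // points at the sign/first digit
}
```

The do-while (rather than while) guarantees that zero still produces one digit.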

std::string valueToString( UInt value )
{
   char buffer[32];
   char *current = buffer + sizeof(buffer);
   uintToString( value, current );
   assert( current >= buffer );
   return current;
}

std::string valueToString( double value )
{
   char buffer[32];
#if defined(_MSC_VER) && defined(__STDC_SECURE_LIB__) // Use secure version with visual studio 2005 to avoid warning.
   sprintf_s(buffer, sizeof(buffer), "%#.16g", value);
#else
   sprintf(buffer, "%#.16g", value);
#endif
   char* ch = buffer + strlen(buffer) - 1;
   if (*ch != '0') return buffer; // nothing to truncate, so save time
   while(ch > buffer && *ch == '0'){
      --ch;
   }
   char* last_nonzero = ch;
   while(ch >= buffer){
      switch(*ch){
      case '0':
      case '1':
      case '2':
      case '3':
      case '4':
      case '5':
      case '6':
      case '7':
      case '8':
      case '9':
         --ch;
         continue;
      case '.':
         // Truncate zeroes to save bytes in output, but keep one.
         *(last_nonzero+2) = '\0';
         return buffer;
      default:
         return buffer;
      }
   }
   return buffer;
}
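valueToString(double) renders with `%#.16g` and then trims the run of trailing zeros, keeping one character after the last significant one. The trimming step in isolation might look like this (hypothetical helper; plain-decimal inputs only, since the switch above bails out unchanged on exponent forms via its `default` case):

```cpp
#include <string>

// Trim trailing zeros from a "%#.16g"-style decimal rendering, keeping one
// character after the last significant one -- the same post-processing that
// valueToString(double) applies above. The helper name is illustrative.
static std::string trimTrailingZeros(std::string text)
{
    if (text.find('.') == std::string::npos ||
        text.find('e') != std::string::npos ||   // leave exponent forms alone
        text[text.size() - 1] != '0')
        return text;                             // nothing to trim
    std::string::size_type last = text.find_last_not_of('0');
    // Keep one zero after the last significant character, as the writer does.
    text.erase(last + 2);
    return text;
}
```

So "17.5000000000000000" becomes "17.50" and "1024.000000000000000" becomes "1024.0", matching the `*(last_nonzero+2) = '\0'` truncation above.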

std::string valueToString( bool value )
{
   return value ? "true" : "false";
}

std::string valueToQuotedString( const char *value )
{
   // Not sure how to handle unicode...
   if (strpbrk(value, "\"\\\b\f\n\r\t") == NULL && !containsControlCharacter( value ))
      return std::string("\"") + value + "\"";
   // We have to walk value and escape any special characters.
   // Appending to std::string is not efficient, but this should be rare.
   // (Note: forward slashes are *not* rare, but I am not escaping them.)
   unsigned maxsize = strlen(value)*2 + 3; // allescaped+quotes+NULL
   std::string result;
   result.reserve(maxsize); // to avoid lots of mallocs
   result += "\"";
   for (const char* c=value; *c != 0; ++c)
   {
      switch(*c)
      {
         case '\"':
            result += "\\\"";
            break;
         case '\\':
            result += "\\\\";
            break;
         case '\b':
            result += "\\b";
            break;
         case '\f':
            result += "\\f";
            break;
         case '\n':
            result += "\\n";
            break;
         case '\r':
            result += "\\r";
            break;
         case '\t':
            result += "\\t";
            break;
         //case '/':
            // Even though \/ is considered a legal escape in JSON, a bare
            // slash is also legal, so I see no reason to escape it.
            // (I hope I am not misunderstanding something.)
            // blep notes: actually escaping \/ may be useful in javascript to avoid </
            // sequence.
            // Should add a flag to allow this compatibility mode and prevent this
            // sequence from occurring.
         default:
            if ( isControlCharacter( *c ) )
            {
               std::ostringstream oss;
               oss << "\\u" << std::hex << std::uppercase << std::setfill('0') << std::setw(4) << static_cast<int>(*c);
               result += oss.str();
            }
            else
            {
               result += *c;
            }
            break;
      }
   }
   result += "\"";
   return result;
}

// Class Writer
// //////////////////////////////////////////////////////////////////
Writer::~Writer()
{
}


// Class FastWriter
// //////////////////////////////////////////////////////////////////

FastWriter::FastWriter()
   : yamlCompatiblityEnabled_( false )
{
}


void
FastWriter::enableYAMLCompatibility()
{
   yamlCompatiblityEnabled_ = true;
}


std::string
FastWriter::write( const Value &root )
{
   document_ = "";
   writeValue( root );
   document_ += "\n";
   return document_;
}


void
FastWriter::writeValue( const Value &value )
{
   switch ( value.type() )
   {
   case nullValue:
      document_ += "null";
      break;
   case intValue:
      document_ += valueToString( value.asInt() );
      break;
   case uintValue:
      document_ += valueToString( value.asUInt() );
      break;
   case realValue:
      document_ += valueToString( value.asDouble() );
      break;
   case stringValue:
      document_ += valueToQuotedString( value.asCString() );
      break;
   case booleanValue:
      document_ += valueToString( value.asBool() );
      break;
   case arrayValue:
      {
         document_ += "[";
         int size = value.size();
         for ( int index = 0; index < size; ++index )
         {
            if ( index > 0 )
               document_ += ",";
            writeValue( value[index] );
         }
         document_ += "]";
      }
      break;
   case objectValue:
      {
         Value::Members members( value.getMemberNames() );
         document_ += "{";
         for ( Value::Members::iterator it = members.begin();
               it != members.end();
               ++it )
         {
            const std::string &name = *it;
            if ( it != members.begin() )
               document_ += ",";
            document_ += valueToQuotedString( name.c_str() );
            document_ += yamlCompatiblityEnabled_ ? ": "
                                                  : ":";
            writeValue( value[name] );
         }
         document_ += "}";
      }
      break;
   }
}


// Class StyledWriter
// //////////////////////////////////////////////////////////////////

StyledWriter::StyledWriter()
   : rightMargin_( 74 )
   , indentSize_( 3 )
{
}


std::string
StyledWriter::write( const Value &root )
{
   document_ = "";
   addChildValues_ = false;
   indentString_ = "";
   writeCommentBeforeValue( root );
   writeValue( root );
   writeCommentAfterValueOnSameLine( root );
   document_ += "\n";
   return document_;
}


void
StyledWriter::writeValue( const Value &value )
{
   switch ( value.type() )
   {
   case nullValue:
      pushValue( "null" );
      break;
   case intValue:
      pushValue( valueToString( value.asInt() ) );
      break;
   case uintValue:
      pushValue( valueToString( value.asUInt() ) );
      break;
   case realValue:
      pushValue( valueToString( value.asDouble() ) );
      break;
   case stringValue:
      pushValue( valueToQuotedString( value.asCString() ) );
      break;
   case booleanValue:
      pushValue( valueToString( value.asBool() ) );
      break;
   case arrayValue:
      writeArrayValue( value );
      break;
   case objectValue:
      {
         Value::Members members( value.getMemberNames() );
         if ( members.empty() )
            pushValue( "{}" );
         else
         {
            writeWithIndent( "{" );
            indent();
            Value::Members::iterator it = members.begin();
            while ( true )
            {
               const std::string &name = *it;
               const Value &childValue = value[name];
               writeCommentBeforeValue( childValue );
               writeWithIndent( valueToQuotedString( name.c_str() ) );
               document_ += " : ";
               writeValue( childValue );
               if ( ++it == members.end() )
               {
                  writeCommentAfterValueOnSameLine( childValue );
                  break;
               }
               document_ += ",";
               writeCommentAfterValueOnSameLine( childValue );
            }
            unindent();
            writeWithIndent( "}" );
         }
      }
      break;
   }
}


void
StyledWriter::writeArrayValue( const Value &value )
{
   unsigned size = value.size();
   if ( size == 0 )
      pushValue( "[]" );
   else
   {
      bool isArrayMultiLine = isMultineArray( value );
      if ( isArrayMultiLine )
      {
         writeWithIndent( "[" );
         indent();
         bool hasChildValue = !childValues_.empty();
         unsigned index = 0;
         while ( true )
         {
            const Value &childValue = value[index];
            writeCommentBeforeValue( childValue );
            if ( hasChildValue )
               writeWithIndent( childValues_[index] );
            else
            {
               writeIndent();
               writeValue( childValue );
            }
            if ( ++index == size )
            {
               writeCommentAfterValueOnSameLine( childValue );
               break;
            }
            document_ += ",";
            writeCommentAfterValueOnSameLine( childValue );
         }
         unindent();
         writeWithIndent( "]" );
      }
      else // output on a single line
      {
         assert( childValues_.size() == size );
         document_ += "[ ";
         for ( unsigned index = 0; index < size; ++index )
         {
            if ( index > 0 )
               document_ += ", ";
            document_ += childValues_[index];
         }
         document_ += " ]";
      }
   }
}


bool
StyledWriter::isMultineArray( const Value &value )
{
   int size = value.size();
   bool isMultiLine = size*3 >= rightMargin_ ;
   childValues_.clear();
   for ( int index = 0; index < size && !isMultiLine; ++index )
   {
      const Value &childValue = value[index];
      isMultiLine = isMultiLine ||
                    ( (childValue.isArray() || childValue.isObject()) &&
                      childValue.size() > 0 );
   }
   if ( !isMultiLine ) // check if line length > max line length
   {
      childValues_.reserve( size );
      addChildValues_ = true;
      int lineLength = 4 + (size-1)*2; // '[ ' + ', '*n + ' ]'
      for ( int index = 0; index < size && !isMultiLine; ++index )
      {
         writeValue( value[index] );
         lineLength += int( childValues_[index].length() );
         isMultiLine = isMultiLine && hasCommentForValue( value[index] );
      }
      addChildValues_ = false;
      isMultiLine = isMultiLine || lineLength >= rightMargin_;
   }
   return isMultiLine;
}


void
StyledWriter::pushValue( const std::string &value )
{
   if ( addChildValues_ )
      childValues_.push_back( value );
   else
      document_ += value;
}


void
StyledWriter::writeIndent()
{
   if ( !document_.empty() )
   {
      char last = document_[document_.length()-1];
      if ( last == ' ' )     // already indented
         return;
      if ( last != '\n' )    // Comments may add new-line
         document_ += '\n';
   }
   document_ += indentString_;
}


void
StyledWriter::writeWithIndent( const std::string &value )
{
   writeIndent();
   document_ += value;
}


void
StyledWriter::indent()
{
   indentString_ += std::string( indentSize_, ' ' );
}


void
StyledWriter::unindent()
{
   assert( int(indentString_.size()) >= indentSize_ );
   indentString_.resize( indentString_.size() - indentSize_ );
}


void
StyledWriter::writeCommentBeforeValue( const Value &root )
{
   if ( !root.hasComment( commentBefore ) )
      return;
   document_ += normalizeEOL( root.getComment( commentBefore ) );
   document_ += "\n";
}


void
StyledWriter::writeCommentAfterValueOnSameLine( const Value &root )
{
   if ( root.hasComment( commentAfterOnSameLine ) )
      document_ += " " + normalizeEOL( root.getComment( commentAfterOnSameLine ) );

   if ( root.hasComment( commentAfter ) )
   {
      document_ += "\n";
      document_ += normalizeEOL( root.getComment( commentAfter ) );
      document_ += "\n";
   }
}


bool
StyledWriter::hasCommentForValue( const Value &value )
{
   return value.hasComment( commentBefore )
          ||  value.hasComment( commentAfterOnSameLine )
          ||  value.hasComment( commentAfter );
}


std::string
StyledWriter::normalizeEOL( const std::string &text )
{
   std::string normalized;
   normalized.reserve( text.length() );
   const char *begin = text.c_str();
   const char *end = begin + text.length();
   const char *current = begin;
   while ( current != end )
   {
      char c = *current++;
      if ( c == '\r' ) // mac or dos EOL
      {
         if ( *current == '\n' ) // convert dos EOL
            ++current;
         normalized += '\n';
      }
      else // handle unix EOL & other char
         normalized += c;
   }
   return normalized;
}
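The line-ending normalization above is easy to exercise on its own. A standalone, index-based equivalent (the helper name is mine) that maps old-Mac CR and DOS CRLF endings to a single LF:

```cpp
#include <string>

// Standalone equivalent of StyledWriter::normalizeEOL above: rewrite "\r"
// (old Mac) and "\r\n" (DOS) line endings as "\n"; everything else is copied.
static std::string normalizeEol(const std::string &text)
{
    std::string normalized;
    normalized.reserve(text.length());
    for (std::string::size_type i = 0; i < text.length(); ++i)
    {
        if (text[i] == '\r')                            // mac or dos EOL
        {
            if (i + 1 < text.length() && text[i + 1] == '\n')
                ++i;                                    // swallow the LF of a CRLF pair
            normalized += '\n';
        }
        else                                            // unix EOL & ordinary characters
            normalized += text[i];
    }
    return normalized;
}
```

The explicit `i + 1 < text.length()` bounds check makes the CRLF lookahead safe even for a string ending in a bare CR; the pointer version above leans on c_str()'s NUL terminator for the same guarantee.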

// Class StyledStreamWriter
// //////////////////////////////////////////////////////////////////

StyledStreamWriter::StyledStreamWriter( std::string indentation )
   : document_(NULL)
   , rightMargin_( 74 )
   , indentation_( indentation )
{
}


void
StyledStreamWriter::write( std::ostream &out, const Value &root )
{
   document_ = &out;
   addChildValues_ = false;
   indentString_ = "";
   writeCommentBeforeValue( root );
   writeValue( root );
   writeCommentAfterValueOnSameLine( root );
   *document_ << "\n";
   document_ = NULL; // Forget the stream, for safety.
}


void
StyledStreamWriter::writeValue( const Value &value )
{
   switch ( value.type() )
   {
   case nullValue:
      pushValue( "null" );
      break;
   case intValue:
      pushValue( valueToString( value.asInt() ) );
      break;
   case uintValue:
      pushValue( valueToString( value.asUInt() ) );
      break;
   case realValue:
      pushValue( valueToString( value.asDouble() ) );
      break;
   case stringValue:
      pushValue( valueToQuotedString( value.asCString() ) );
      break;
   case booleanValue:
      pushValue( valueToString( value.asBool() ) );
      break;
   case arrayValue:
      writeArrayValue( value );
      break;
   case objectValue:
      {
         Value::Members members( value.getMemberNames() );
         if ( members.empty() )
            pushValue( "{}" );
         else
         {
            writeWithIndent( "{" );
            indent();
            Value::Members::iterator it = members.begin();
            while ( true )
            {
               const std::string &name = *it;
               const Value &childValue = value[name];
               writeCommentBeforeValue( childValue );
               writeWithIndent( valueToQuotedString( name.c_str() ) );
               *document_ << " : ";
               writeValue( childValue );
               if ( ++it == members.end() )
               {
                  writeCommentAfterValueOnSameLine( childValue );
                  break;
               }
               *document_ << ",";
               writeCommentAfterValueOnSameLine( childValue );
            }
            unindent();
            writeWithIndent( "}" );
         }
      }
      break;
   }
}


void
StyledStreamWriter::writeArrayValue( const Value &value )
{
   unsigned size = value.size();
   if ( size == 0 )
      pushValue( "[]" );
   else
   {
      bool isArrayMultiLine = isMultineArray( value );
      if ( isArrayMultiLine )
      {
         writeWithIndent( "[" );
         indent();
         bool hasChildValue = !childValues_.empty();
         unsigned index = 0;
         while ( true )
         {
            const Value &childValue = value[index];
            writeCommentBeforeValue( childValue );
            if ( hasChildValue )
               writeWithIndent( childValues_[index] );
            else
            {
               writeIndent();
               writeValue( childValue );
            }
            if ( ++index == size )
            {
               writeCommentAfterValueOnSameLine( childValue );
               break;
            }
            *document_ << ",";
            writeCommentAfterValueOnSameLine( childValue );
         }
         unindent();
         writeWithIndent( "]" );
      }
      else // output on a single line
      {
         assert( childValues_.size() == size );
         *document_ << "[ ";
         for ( unsigned index = 0; index < size; ++index )
         {
            if ( index > 0 )
               *document_ << ", ";
            *document_ << childValues_[index];
         }
         *document_ << " ]";
      }
   }
}


bool
StyledStreamWriter::isMultineArray( const Value &value )
{
   int size = value.size();
   bool isMultiLine = size*3 >= rightMargin_ ;
   childValues_.clear();
   for ( int index = 0; index < size && !isMultiLine; ++index )
   {
      const Value &childValue = value[index];
      isMultiLine = isMultiLine ||
                    ( (childValue.isArray() || childValue.isObject()) &&
                      childValue.size() > 0 );
   }
   if ( !isMultiLine ) // check if line length > max line length
   {
      childValues_.reserve( size );
      addChildValues_ = true;
      int lineLength = 4 + (size-1)*2; // '[ ' + ', '*n + ' ]'
      for ( int index = 0; index < size && !isMultiLine; ++index )
      {
         writeValue( value[index] );
         lineLength += int( childValues_[index].length() );
         isMultiLine = isMultiLine && hasCommentForValue( value[index] );
      }
      addChildValues_ = false;
      isMultiLine = isMultiLine || lineLength >= rightMargin_;
   }
   return isMultiLine;
}


void
StyledStreamWriter::pushValue( const std::string &value )
{
   if ( addChildValues_ )
      childValues_.push_back( value );
   else
      *document_ << value;
}


void
StyledStreamWriter::writeIndent()
{
  /*
    Some comments in this method would have been nice. ;-)

   if ( !document_.empty() )
   {
      char last = document_[document_.length()-1];
      if ( last == ' ' )     // already indented
         return;
      if ( last != '\n' )    // Comments may add new-line
         *document_ << '\n';
   }
  */
   *document_ << '\n' << indentString_;
}


void
StyledStreamWriter::writeWithIndent( const std::string &value )
{
   writeIndent();
   *document_ << value;
}


void
StyledStreamWriter::indent()
{
   indentString_ += indentation_;
}


void
StyledStreamWriter::unindent()
{
   assert( indentString_.size() >= indentation_.size() );
   indentString_.resize( indentString_.size() - indentation_.size() );
}


void
StyledStreamWriter::writeCommentBeforeValue( const Value &root )
{
   if ( !root.hasComment( commentBefore ) )
      return;
   *document_ << normalizeEOL( root.getComment( commentBefore ) );
   *document_ << "\n";
}


void
StyledStreamWriter::writeCommentAfterValueOnSameLine( const Value &root )
{
   if ( root.hasComment( commentAfterOnSameLine ) )
      *document_ << " " + normalizeEOL( root.getComment( commentAfterOnSameLine ) );

   if ( root.hasComment( commentAfter ) )
   {
      *document_ << "\n";
      *document_ << normalizeEOL( root.getComment( commentAfter ) );
      *document_ << "\n";
   }
}


bool
StyledStreamWriter::hasCommentForValue( const Value &value )
{
   return value.hasComment( commentBefore )
          ||  value.hasComment( commentAfterOnSameLine )
          ||  value.hasComment( commentAfter );
}


std::string
StyledStreamWriter::normalizeEOL( const std::string &text )
{
   std::string normalized;
   normalized.reserve( text.length() );
   const char *begin = text.c_str();
   const char *end = begin + text.length();
   const char *current = begin;
   while ( current != end )
   {
      char c = *current++;
      if ( c == '\r' ) // mac or dos EOL
      {
         if ( *current == '\n' ) // convert dos EOL
            ++current;
         normalized += '\n';
      }
      else // handle unix EOL & other char
         normalized += c;
   }
   return normalized;
}


std::ostream& operator<<( std::ostream &sout, const Value &root )
{
   Json::StyledStreamWriter writer;
   writer.write(sout, root);
   return sout;
}


} // namespace Json
8
PowerEditor/src/jsoncpp/src/lib_json/sconscript
Normal file
@ -0,0 +1,8 @@
Import( 'env buildLibrary' )

buildLibrary( env, Split( """
   json_reader.cpp
   json_value.cpp
   json_writer.cpp
   """ ),
   'json' )
603
PowerEditor/src/jsoncpp/src/test_lib_json/jsontest.cpp
Normal file
@ -0,0 +1,603 @@
#define _CRT_SECURE_NO_WARNINGS 1 // Prevents deprecation warning with MSVC
#include "jsontest.h"
#include <stdio.h>
#include <string>

#if defined(_MSC_VER)
// Used to install a report hook that prevents dialog on assertion and error.
# include <crtdbg.h>
#endif // if defined(_MSC_VER)

#if defined(_WIN32)
// Used to prevent dialog on memory fault.
// Limits headers included by Windows.h
# define WIN32_LEAN_AND_MEAN
# define NOSERVICE
# define NOMCX
# define NOIME
# define NOSOUND
# define NOCOMM
# define NORPC
# define NOGDI
# define NOUSER
# define NODRIVERS
# define NOLOGERROR
# define NOPROFILER
# define NOMEMMGR
# define NOLFILEIO
# define NOOPENFILE
# define NORESOURCE
# define NOATOM
# define NOLANGUAGE
# define NOLSTRING
# define NODBCS
# define NOKEYBOARDINFO
# define NOGDICAPMASKS
# define NOCOLOR
# define NOGDIOBJ
# define NODRAWTEXT
# define NOTEXTMETRIC
# define NOSCALABLEFONT
# define NOBITMAP
# define NORASTEROPS
# define NOMETAFILE
# define NOSYSMETRICS
# define NOSYSTEMPARAMSINFO
# define NOMSG
# define NOWINSTYLES
# define NOWINOFFSETS
# define NOSHOWWINDOW
# define NODEFERWINDOWPOS
# define NOVIRTUALKEYCODES
# define NOKEYSTATES
# define NOWH
# define NOMENUS
# define NOSCROLL
# define NOCLIPBOARD
# define NOICONS
# define NOMB
# define NOSYSCOMMANDS
# define NOMDI
# define NOCTLMGR
# define NOWINMESSAGES
# include <windows.h>
#endif // if defined(_WIN32)

namespace JsonTest {


// class TestResult
// //////////////////////////////////////////////////////////////////

TestResult::TestResult()
   : predicateId_( 1 )
   , lastUsedPredicateId_( 0 )
   , messageTarget_( 0 )
{
   // The root predicate has id 0
   rootPredicateNode_.id_ = 0;
   rootPredicateNode_.next_ = 0;
   predicateStackTail_ = &rootPredicateNode_;
}


void
TestResult::setTestName( const std::string &name )
{
   name_ = name;
}

TestResult &
TestResult::addFailure( const char *file, unsigned int line,
                        const char *expr )
{
   /// Walks the PredicateContext stack adding them to failures_ if not already added.
   unsigned int nestingLevel = 0;
   PredicateContext *lastNode = rootPredicateNode_.next_;
   for ( ; lastNode != 0; lastNode = lastNode->next_ )
   {
      if ( lastNode->id_ > lastUsedPredicateId_ ) // new PredicateContext
      {
         lastUsedPredicateId_ = lastNode->id_;
         addFailureInfo( lastNode->file_, lastNode->line_, lastNode->expr_,
                         nestingLevel );
         // Link the PredicateContext to the failure for message target when
         // popping the PredicateContext.
         lastNode->failure_ = &( failures_.back() );
      }
      ++nestingLevel;
   }

   // Adds the failed assertion
   addFailureInfo( file, line, expr, nestingLevel );
   messageTarget_ = &( failures_.back() );
   return *this;
}


void
TestResult::addFailureInfo( const char *file, unsigned int line,
                            const char *expr, unsigned int nestingLevel )
{
   Failure failure;
   failure.file_ = file;
   failure.line_ = line;
   if ( expr )
   {
      failure.expr_ = expr;
   }
   failure.nestingLevel_ = nestingLevel;
   failures_.push_back( failure );
}


TestResult &
TestResult::popPredicateContext()
{
   PredicateContext *lastNode = &rootPredicateNode_;
   while ( lastNode->next_ != 0  &&  lastNode->next_->next_ != 0 )
   {
      lastNode = lastNode->next_;
   }
   // Set message target to popped failure
   PredicateContext *tail = lastNode->next_;
   if ( tail != 0  &&  tail->failure_ != 0 )
   {
      messageTarget_ = tail->failure_;
   }
   // Remove tail from list
   predicateStackTail_ = lastNode;
   lastNode->next_ = 0;
   return *this;
}


bool
TestResult::failed() const
{
   return !failures_.empty();
}


unsigned int
TestResult::getAssertionNestingLevel() const
{
   unsigned int level = 0;
   const PredicateContext *lastNode = &rootPredicateNode_;
   while ( lastNode->next_ != 0 )
   {
      lastNode = lastNode->next_;
      ++level;
   }
   return level;
}


void
TestResult::printFailure( bool printTestName ) const
{
   if ( failures_.empty() )
   {
      return;
   }

   if ( printTestName )
   {
      printf( "* Detail of %s test failure:\n", name_.c_str() );
   }

   // Print in reverse to display the callstack in the right order
   Failures::const_iterator itEnd = failures_.end();
   for ( Failures::const_iterator it = failures_.begin(); it != itEnd; ++it )
   {
      const Failure &failure = *it;
      std::string indent( failure.nestingLevel_ * 2, ' ' );
      if ( failure.file_ )
      {
         printf( "%s%s(%d): ", indent.c_str(), failure.file_, failure.line_ );
      }
      if ( !failure.expr_.empty() )
      {
         printf( "%s\n", failure.expr_.c_str() );
      }
      else if ( failure.file_ )
      {
         printf( "\n" );
      }
      if ( !failure.message_.empty() )
      {
         std::string reindented = indentText( failure.message_, indent + " " );
         printf( "%s\n", reindented.c_str() );
      }
   }
}


std::string
TestResult::indentText( const std::string &text,
                        const std::string &indent )
{
   std::string reindented;
   std::string::size_type lastIndex = 0;
   while ( lastIndex < text.size() )
   {
      std::string::size_type nextIndex = text.find( '\n', lastIndex );
      if ( nextIndex == std::string::npos )
      {
         nextIndex = text.size() - 1;
      }
      reindented += indent;
      reindented += text.substr( lastIndex, nextIndex - lastIndex + 1 );
      lastIndex = nextIndex + 1;
   }
   return reindented;
}


TestResult &
TestResult::addToLastFailure( const std::string &message )
{
   if ( messageTarget_ != 0 )
   {
      messageTarget_->message_ += message;
   }
   return *this;
}


TestResult &
TestResult::operator << ( bool value )
{
   return addToLastFailure( value ? "true" : "false" );
}


TestResult &
TestResult::operator << ( int value )
{
   char buffer[32];
   sprintf( buffer, "%d", value );
   return addToLastFailure( buffer );
}


TestResult &
TestResult::operator << ( unsigned int value )
{
   char buffer[32];
   sprintf( buffer, "%u", value );
   return addToLastFailure( buffer );
}


TestResult &
TestResult::operator << ( double value )
{
   char buffer[32];
   sprintf( buffer, "%16g", value );
   return addToLastFailure( buffer );
}


TestResult &
TestResult::operator << ( const char *value )
{
   return addToLastFailure( value ? value
                                  : "<NULL>" );
}


TestResult &
TestResult::operator << ( const std::string &value )
{
   return addToLastFailure( value );
}



// class TestCase
// //////////////////////////////////////////////////////////////////

TestCase::TestCase()
   : result_( 0 )
{
}


TestCase::~TestCase()
{
}


void
TestCase::run( TestResult &result )
{
   result_ = &result;
   runTestCase();
}



// class Runner
// //////////////////////////////////////////////////////////////////

Runner::Runner()
{
}


Runner &
Runner::add( TestCaseFactory factory )
{
   tests_.push_back( factory );
   return *this;
}


unsigned int
Runner::testCount() const
{
   return static_cast<unsigned int>( tests_.size() );
}


std::string
Runner::testNameAt( unsigned int index ) const
{
   TestCase *test = tests_[index]();
   std::string name = test->testName();
   delete test;
   return name;
}


void
Runner::runTestAt( unsigned int index, TestResult &result ) const
{
   TestCase *test = tests_[index]();
   result.setTestName( test->testName() );
   printf( "Testing %s: ", test->testName() );
   fflush( stdout );
#if JSON_USE_EXCEPTION
   try
   {
#endif // if JSON_USE_EXCEPTION
      test->run( result );
#if JSON_USE_EXCEPTION
   }
   catch ( const std::exception &e )
   {
      result.addFailure( __FILE__, __LINE__,
                         "Unexpected exception caught:" ) << e.what();
   }
#endif // if JSON_USE_EXCEPTION
   delete test;
   const char *status = result.failed() ? "FAILED"
                                        : "OK";
   printf( "%s\n", status );
   fflush( stdout );
}


bool
Runner::runAllTest( bool printSummary ) const
{
   unsigned int count = testCount();
   std::deque<TestResult> failures;
   for ( unsigned int index = 0; index < count; ++index )
   {
      TestResult result;
      runTestAt( index, result );
      if ( result.failed() )
      {
         failures.push_back( result );
      }
   }

   if ( failures.empty() )
   {
      if ( printSummary )
      {
         printf( "All %d tests passed\n", count );
      }
      return true;
   }
   else
   {
      for ( unsigned int index = 0; index < failures.size(); ++index )
      {
         TestResult &result = failures[index];
         result.printFailure( count > 1 );
      }

      if ( printSummary )
      {
         unsigned int failedCount = static_cast<unsigned int>( failures.size() );
         unsigned int passedCount = count - failedCount;
         printf( "%d/%d tests passed (%d failure(s))\n", passedCount, count, failedCount );
      }
      return false;
   }
}


bool
Runner::testIndex( const std::string &testName,
                   unsigned int &indexOut ) const
{
   unsigned int count = testCount();
   for ( unsigned int index = 0; index < count; ++index )
   {
      if ( testNameAt(index) == testName )
      {
         indexOut = index;
         return true;
      }
   }
   return false;
}


void
Runner::listTests() const
{
   unsigned int count = testCount();
   for ( unsigned int index = 0; index < count; ++index )
   {
      printf( "%s\n", testNameAt( index ).c_str() );
   }
}


int
Runner::runCommandLine( int argc, const char *argv[] ) const
{
   typedef std::deque<std::string> TestNames;
   Runner subrunner;
   for ( int index = 1; index < argc; ++index )
   {
      std::string opt = argv[index];
      if ( opt == "--list-tests" )
      {
         listTests();
         return 0;
      }
      else if ( opt == "--test-auto" )
      {
         preventDialogOnCrash();
      }
      else if ( opt == "--test" )
      {
         ++index;
         if ( index < argc )
         {
            unsigned int testNameIndex;
            if ( testIndex( argv[index], testNameIndex ) )
            {
               subrunner.add( tests_[testNameIndex] );
            }
            else
            {
               fprintf( stderr, "Test '%s' does not exist!\n", argv[index] );
               return 2;
            }
         }
         else
         {
            printUsage( argv[0] );
            return 2;
         }
      }
      else
      {
         printUsage( argv[0] );
         return 2;
      }
   }
   bool succeeded;
   if ( subrunner.testCount() > 0 )
   {
      succeeded = subrunner.runAllTest( subrunner.testCount() > 1 );
   }
   else
   {
      succeeded = runAllTest( true );
   }
   return succeeded ? 0
                    : 1;
}


#if defined(_MSC_VER)
// Hook MSVCRT assertions to prevent dialog from appearing
static int
msvcrtSilentReportHook( int reportType, char *message, int *returnValue )
{
   // The default CRT handling of error and assertion is to display
   // an error dialog to the user.
   // Instead, when an error or an assertion occurs, we force the
   // application to terminate using abort() after displaying
   // the message on stderr.
   if ( reportType == _CRT_ERROR  ||
        reportType == _CRT_ASSERT )
   {
      // calling abort() causes the ReportHook to be called again.
      // The following is used to detect this case and lets the
      // error handler fall back on its default behaviour
      // (displaying a warning message)
      static volatile bool isAborting = false;
      if ( isAborting )
      {
         return TRUE;
      }
      isAborting = true;

      fprintf( stderr, "CRT Error/Assert:\n%s\n", message );
      fflush( stderr );
      abort();
   }
   // Lets other reportTypes (_CRT_WARNING) be handled as they would by default
   return FALSE;
}
#endif // if defined(_MSC_VER)


void
Runner::preventDialogOnCrash()
{
#if defined(_MSC_VER)
   // Install a hook to prevent MSVCRT error and assertion from
   // popping a dialog.
   _CrtSetReportHook( &msvcrtSilentReportHook );
#endif // if defined(_MSC_VER)

   // @todo investigate this handler (for buffer overflow)
   // _set_security_error_handler

#if defined(_WIN32)
   // Prevents the system from popping a dialog for debugging if the
   // application fails due to invalid memory access.
   SetErrorMode( SEM_FAILCRITICALERRORS
                 | SEM_NOGPFAULTERRORBOX
                 | SEM_NOOPENFILEERRORBOX );
#endif // if defined(_WIN32)
}


void
Runner::printUsage( const char *appName )
{
   printf(
      "Usage: %s [options]\n"
      "\n"
      "If --test is not specified, then all the test cases will be run.\n"
      "\n"
      "Valid options:\n"
      "--list-tests: print the name of all test cases on the standard\n"
      "              output and exit.\n"
      "--test TESTNAME: executes the test case with the specified name.\n"
      "                 May be repeated.\n"
      "--test-auto: prevent dialog prompting for debugging on crash.\n"
      , appName );
}



// Assertion functions
// //////////////////////////////////////////////////////////////////

TestResult &
checkStringEqual( TestResult &result,
                  const std::string &expected, const std::string &actual,
                  const char *file, unsigned int line, const char *expr )
{
   if ( expected != actual )
   {
      result.addFailure( file, line, expr );
      result << "Expected: '" << expected << "'\n";
      result << "Actual  : '" << actual << "'";
   }
   return result;
}


} // namespace JsonTest
254
PowerEditor/src/jsoncpp/src/test_lib_json/jsontest.h
Normal file
@ -0,0 +1,254 @@
#ifndef JSONTEST_H_INCLUDED
# define JSONTEST_H_INCLUDED

# include <json/config.h>
# include <stdio.h>
# include <deque>
# include <string>

// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// Mini Unit Testing framework
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////



/** \brief Unit testing framework.
 * \warning: all assertions are non-aborting, test case execution will continue
 *           even if an assertion fails.
 *           This constraint is for portability: the framework needs to compile
 *           on Visual Studio 6 and must not require exception usage.
 */
namespace JsonTest {


   class Failure
   {
   public:
      const char *file_;
      unsigned int line_;
      std::string expr_;
      std::string message_;
      unsigned int nestingLevel_;
   };


   /// Context used to create the assertion callstack on failure.
   /// Must be a POD to allow inline initialisation without stepping
   /// into the debugger.
   struct PredicateContext
   {
      typedef unsigned int Id;
      Id id_;
      const char *file_;
      unsigned int line_;
      const char *expr_;
      PredicateContext *next_;
      /// Related Failure, set when the PredicateContext is converted
      /// into a Failure.
      Failure *failure_;
   };

   class TestResult
   {
   public:
      TestResult();

      /// \internal Implementation detail for assertion macros
      /// Not encapsulated to prevent step into when debugging failed assertions
      /// Incremented by one on assertion predicate entry, decreased by one
      /// by addPredicateContext().
      PredicateContext::Id predicateId_;

      /// \internal Implementation detail for predicate macros
      PredicateContext *predicateStackTail_;

      void setTestName( const std::string &name );

      /// Adds an assertion failure.
      TestResult &addFailure( const char *file, unsigned int line,
                              const char *expr = 0 );

      /// Removes the last PredicateContext added to the predicate stack
      /// chained list.
      /// Next messages will be targeted at the PredicateContext that was removed.
      TestResult &popPredicateContext();

      bool failed() const;

      void printFailure( bool printTestName ) const;

      TestResult &operator << ( bool value );
      TestResult &operator << ( int value );
      TestResult &operator << ( unsigned int value );
      TestResult &operator << ( double value );
      TestResult &operator << ( const char *value );
      TestResult &operator << ( const std::string &value );

   private:
      TestResult &addToLastFailure( const std::string &message );
      unsigned int getAssertionNestingLevel() const;
      /// Adds a failure or a predicate context
      void addFailureInfo( const char *file, unsigned int line,
                           const char *expr, unsigned int nestingLevel );
      static std::string indentText( const std::string &text,
                                     const std::string &indent );

      typedef std::deque<Failure> Failures;
      Failures failures_;
      std::string name_;
      PredicateContext rootPredicateNode_;
      PredicateContext::Id lastUsedPredicateId_;
      /// Failure which is the target of the messages added using operator <<
      Failure *messageTarget_;
   };


   class TestCase
   {
   public:
      TestCase();

      virtual ~TestCase();

      void run( TestResult &result );

      virtual const char *testName() const = 0;

   protected:
      TestResult *result_;

   private:
      virtual void runTestCase() = 0;
   };

   /// Function pointer type for TestCase factory
   typedef TestCase *(*TestCaseFactory)();

   class Runner
   {
   public:
      Runner();

      /// Adds a test to the suite
      Runner &add( TestCaseFactory factory );

      /// Runs test as specified on the command-line
      /// If no command-line arguments are provided, run all tests.
      /// If --list-tests is provided, then print the list of all test cases
      /// If --test <testname> is provided, then run test testname.
      int runCommandLine( int argc, const char *argv[] ) const;

      /// Runs all the test cases
      bool runAllTest( bool printSummary ) const;

      /// Returns the number of test case in the suite
      unsigned int testCount() const;

      /// Returns the name of the test case at the specified index
      std::string testNameAt( unsigned int index ) const;

      /// Runs the test case at the specified index using the specified TestResult
      void runTestAt( unsigned int index, TestResult &result ) const;

      static void printUsage( const char *appName );

   private: // prevents copy construction and assignment
      Runner( const Runner &other );
      Runner &operator =( const Runner &other );

   private:
      void listTests() const;
      bool testIndex( const std::string &testName, unsigned int &index ) const;
      static void preventDialogOnCrash();

   private:
      typedef std::deque<TestCaseFactory> Factories;
      Factories tests_;
   };

   template<typename T>
   TestResult &
   checkEqual( TestResult &result, const T &expected, const T &actual,
               const char *file, unsigned int line, const char *expr )
   {
      if ( expected != actual )
      {
         result.addFailure( file, line, expr );
         result << "Expected: " << expected << "\n";
         result << "Actual  : " << actual;
      }
      return result;
   }

   TestResult &
   checkStringEqual( TestResult &result,
                     const std::string &expected, const std::string &actual,
                     const char *file, unsigned int line, const char *expr );

} // namespace JsonTest


/// \brief Asserts that the given expression is true.
/// JSONTEST_ASSERT( x == y ) << "x=" << x << ", y=" << y;
/// JSONTEST_ASSERT( x == y );
#define JSONTEST_ASSERT( expr )                       \
   if ( expr )                                        \
   {                                                  \
   }                                                  \
   else                                               \
      result_->addFailure( __FILE__, __LINE__, #expr )

/// \brief Asserts that the given predicate is true.
/// The predicate may do other assertions and be a member function of the fixture.
#define JSONTEST_ASSERT_PRED( expr )                              \
   {                                                              \
      JsonTest::PredicateContext _minitest_Context = {            \
         result_->predicateId_, __FILE__, __LINE__, #expr };      \
      result_->predicateStackTail_->next_ = &_minitest_Context;   \
      result_->predicateId_ += 1;                                 \
      result_->predicateStackTail_ = &_minitest_Context;          \
      (expr);                                                     \
      result_->popPredicateContext();                             \
   }                                                              \
   *result_

/// \brief Asserts that two values are equal.
#define JSONTEST_ASSERT_EQUAL( expected, actual )       \
   JsonTest::checkEqual( *result_, expected, actual,    \
                         __FILE__, __LINE__,            \
                         #expected " == " #actual )

/// \brief Asserts that two values are equal.
#define JSONTEST_ASSERT_STRING_EQUAL( expected, actual )                   \
   JsonTest::checkStringEqual( *result_,                                   \
                               std::string(expected), std::string(actual), \
                               #expected " == " #actual )

/// \brief Begin a fixture test case.
#define JSONTEST_FIXTURE( FixtureType, name )            \
   class Test##FixtureType##name : public FixtureType    \
   {                                                     \
   public:                                               \
      static JsonTest::TestCase *factory()               \
      {                                                  \
         return new Test##FixtureType##name();           \
      }                                                  \
   public: /* overridden from TestCase */                \
      virtual const char *testName() const               \
      {                                                  \
         return #FixtureType "/" #name;                  \
      }                                                  \
      virtual void runTestCase();                        \
   };                                                    \
                                                         \
   void Test##FixtureType##name::runTestCase()

#define JSONTEST_FIXTURE_FACTORY( FixtureType, name ) \
   &Test##FixtureType##name::factory

#define JSONTEST_REGISTER_FIXTURE( runner, FixtureType, name ) \
   (runner).add( JSONTEST_FIXTURE_FACTORY( FixtureType, name ) )

#endif // ifndef JSONTEST_H_INCLUDED
244
PowerEditor/src/jsoncpp/src/test_lib_json/main.cpp
Normal file
@ -0,0 +1,244 @@
#include <json/json.h>
#include "jsontest.h"


// TODO:
// - boolean value returns that they are integral. Should not be.
// - unsigned integer in integer range are not considered to be valid integer. Should check range.


// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////
// Json Library test cases
// //////////////////////////////////////////////////////////////////
// //////////////////////////////////////////////////////////////////


struct ValueTest : JsonTest::TestCase
{
   Json::Value null_;
   Json::Value emptyArray_;
   Json::Value emptyObject_;
   Json::Value integer_;
   Json::Value unsignedInteger_;
   Json::Value smallUnsignedInteger_;
   Json::Value real_;
   Json::Value array1_;
   Json::Value object1_;
   Json::Value emptyString_;
   Json::Value string1_;
   Json::Value string_;
   Json::Value true_;
   Json::Value false_;

   ValueTest()
      : emptyArray_( Json::arrayValue )
      , emptyObject_( Json::objectValue )
      , integer_( 123456789 )
      , smallUnsignedInteger_( Json::Value::UInt( Json::Value::maxInt ) )
      , unsignedInteger_( 34567890u )
      , real_( 1234.56789 )
      , emptyString_( "" )
      , string1_( "a" )
      , string_( "sometext with space" )
      , true_( true )
      , false_( false )
   {
      array1_.append( 1234 );
      object1_["id"] = 1234;
   }

   struct IsCheck
   {
      /// Initialize all checks to \c false by default.
      IsCheck();

      bool isObject_;
      bool isArray_;
      bool isBool_;
      bool isDouble_;
      bool isInt_;
      bool isUInt_;
      bool isIntegral_;
      bool isNumeric_;
      bool isString_;
      bool isNull_;
   };

   void checkConstMemberCount( const Json::Value &value, unsigned int expectedCount );

   void checkMemberCount( Json::Value &value, unsigned int expectedCount );

   void checkIs( const Json::Value &value, const IsCheck &check );
};


JSONTEST_FIXTURE( ValueTest, size )
{
   JSONTEST_ASSERT_PRED( checkMemberCount(emptyArray_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(emptyObject_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(array1_, 1) );
   JSONTEST_ASSERT_PRED( checkMemberCount(object1_, 1) );
   JSONTEST_ASSERT_PRED( checkMemberCount(null_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(integer_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(real_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(emptyString_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(string_, 0) );
   JSONTEST_ASSERT_PRED( checkMemberCount(true_, 0) );
}


JSONTEST_FIXTURE( ValueTest, isObject )
{
   IsCheck checks;
   checks.isObject_ = true;
   JSONTEST_ASSERT_PRED( checkIs( emptyObject_, checks ) );
   JSONTEST_ASSERT_PRED( checkIs( object1_, checks ) );
}


JSONTEST_FIXTURE( ValueTest, isArray )
{
   IsCheck checks;
   checks.isArray_ = true;
   JSONTEST_ASSERT_PRED( checkIs( emptyArray_, checks ) );
   JSONTEST_ASSERT_PRED( checkIs( array1_, checks ) );
}


JSONTEST_FIXTURE( ValueTest, isNull )
{
   IsCheck checks;
   checks.isNull_ = true;
   checks.isObject_ = true;
   checks.isArray_ = true;
   JSONTEST_ASSERT_PRED( checkIs( null_, checks ) );
}


JSONTEST_FIXTURE( ValueTest, isString )
{
   IsCheck checks;
   checks.isString_ = true;
   JSONTEST_ASSERT_PRED( checkIs( emptyString_, checks ) );
   JSONTEST_ASSERT_PRED( checkIs( string_, checks ) );
   JSONTEST_ASSERT_PRED( checkIs( string1_, checks ) );
}


JSONTEST_FIXTURE( ValueTest, isBool )
{
   IsCheck checks;
   checks.isBool_ = true;
   checks.isIntegral_ = true;
   checks.isNumeric_ = true;
   JSONTEST_ASSERT_PRED( checkIs( false_, checks ) );
   JSONTEST_ASSERT_PRED( checkIs( true_, checks ) );
}


JSONTEST_FIXTURE( ValueTest, isDouble )
{
   IsCheck checks;
   checks.isDouble_ = true;
   checks.isNumeric_ = true;
   JSONTEST_ASSERT_PRED( checkIs( real_, checks ) );
}


JSONTEST_FIXTURE( ValueTest, isInt )
{
   IsCheck checks;
   checks.isInt_ = true;
   checks.isNumeric_ = true;
|
||||||
|
checks.isIntegral_ = true;
|
||||||
|
JSONTEST_ASSERT_PRED( checkIs( integer_, checks ) );
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
JSONTEST_FIXTURE( ValueTest, isUInt )
|
||||||
|
{
|
||||||
|
IsCheck checks;
|
||||||
|
checks.isUInt_ = true;
|
||||||
|
checks.isNumeric_ = true;
|
||||||
|
checks.isIntegral_ = true;
|
||||||
|
JSONTEST_ASSERT_PRED( checkIs( unsignedInteger_, checks ) );
|
||||||
|
JSONTEST_ASSERT_PRED( checkIs( smallUnsignedInteger_, checks ) );
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
void
|
||||||
|
ValueTest::checkConstMemberCount( const Json::Value &value, unsigned int expectedCount )
|
||||||
|
{
|
||||||
|
unsigned int count = 0;
|
||||||
|
Json::Value::const_iterator itEnd = value.end();
|
||||||
|
for ( Json::Value::const_iterator it = value.begin(); it != itEnd; ++it )
|
||||||
|
{
|
||||||
|
++count;
|
||||||
|
}
|
||||||
|
JSONTEST_ASSERT_EQUAL( expectedCount, count ) << "Json::Value::const_iterator";
|
||||||
|
}
|
||||||
|
|
||||||
|
void
|
||||||
|
ValueTest::checkMemberCount( Json::Value &value, unsigned int expectedCount )
|
||||||
|
{
|
||||||
|
JSONTEST_ASSERT_EQUAL( expectedCount, value.size() );
|
||||||
|
|
||||||
|
unsigned int count = 0;
|
||||||
|
Json::Value::iterator itEnd = value.end();
|
||||||
|
for ( Json::Value::iterator it = value.begin(); it != itEnd; ++it )
|
||||||
|
{
|
||||||
|
++count;
|
||||||
|
}
|
||||||
|
JSONTEST_ASSERT_EQUAL( expectedCount, count ) << "Json::Value::iterator";
|
||||||
|
|
||||||
|
JSONTEST_ASSERT_PRED( checkConstMemberCount(value, expectedCount) );
|
||||||
|
}
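The helper above deliberately counts members twice: once via `Json::Value::size()` and once by stepping the iterators, so a broken iterator implementation cannot hide behind a correct `size()`. A minimal Python sketch of the same cross-check idea (illustrative, not part of jsoncpp):

```python
def check_member_count(value, expected_count):
    """Cross-check len() against a manual iteration count, the same way
    checkMemberCount compares Json::Value::size() with the number of
    steps its iterators take. `value` is any sized, iterable container."""
    assert len(value) == expected_count, "size() disagrees"
    # Walk the container element by element, like the for-loop above.
    count = sum(1 for _ in value)
    assert count == expected_count, "iterator count disagrees"
```

The fixtures above would then read, e.g., `check_member_count([1234], 1)` for `array1_` and `check_member_count({}, 0)` for `emptyObject_`.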


ValueTest::IsCheck::IsCheck()
   : isObject_( false )
   , isArray_( false )
   , isBool_( false )
   , isDouble_( false )
   , isInt_( false )
   , isUInt_( false )
   , isIntegral_( false )
   , isNumeric_( false )
   , isString_( false )
   , isNull_( false )
{
}


void
ValueTest::checkIs( const Json::Value &value, const IsCheck &check )
{
   JSONTEST_ASSERT_EQUAL( check.isObject_, value.isObject() );
   JSONTEST_ASSERT_EQUAL( check.isArray_, value.isArray() );
   JSONTEST_ASSERT_EQUAL( check.isBool_, value.isBool() );
   JSONTEST_ASSERT_EQUAL( check.isDouble_, value.isDouble() );
   JSONTEST_ASSERT_EQUAL( check.isInt_, value.isInt() );
   JSONTEST_ASSERT_EQUAL( check.isUInt_, value.isUInt() );
   JSONTEST_ASSERT_EQUAL( check.isIntegral_, value.isIntegral() );
   JSONTEST_ASSERT_EQUAL( check.isNumeric_, value.isNumeric() );
   JSONTEST_ASSERT_EQUAL( check.isString_, value.isString() );
   JSONTEST_ASSERT_EQUAL( check.isNull_, value.isNull() );
}
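checkIs asserts every type predicate in one shot against the IsCheck matrix, so each fixture only flips the flags it expects to be true and everything else is implicitly asserted false. A rough Python sketch of the same pattern, using a stand-in value class (the class and names here are illustrative, not jsoncpp API):

```python
PREDICATES = ("isObject", "isArray", "isBool", "isDouble", "isInt",
              "isUInt", "isIntegral", "isNumeric", "isString", "isNull")

def check_is(value, **expected_true):
    # Every predicate defaults to False, exactly like IsCheck's
    # constructor; callers name only the flags they expect to hold.
    for name in PREDICATES:
        want = expected_true.get(name, False)
        got = getattr(value, name)()
        assert got == want, "%s: expected %r, got %r" % (name, want, got)

class FakeString:
    """Stand-in value whose only true predicate is isString()."""
    def __getattr__(self, name):
        if name in PREDICATES:
            return lambda: name == "isString"
        raise AttributeError(name)
```

With this shape, forgetting to set a flag fails loudly instead of silently skipping a predicate, which is the point of the matrix.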



int main( int argc, const char *argv[] )
{
   JsonTest::Runner runner;
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, size );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isObject );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isArray );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isBool );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isInt );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isUInt );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isDouble );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isString );
   JSONTEST_REGISTER_FIXTURE( runner, ValueTest, isNull );
   return runner.runCommandLine( argc, argv );
}
10
PowerEditor/src/jsoncpp/src/test_lib_json/sconscript
Normal file
@@ -0,0 +1,10 @@
Import( 'env_testing buildUnitTests' )

buildUnitTests( env_testing, Split( """
    main.cpp
    jsontest.cpp
    """ ),
    'test_lib_json' )

# For 'check' to work, 'libs' must be built first.
env_testing.Depends('test_lib_json', '#libs')
10
PowerEditor/src/jsoncpp/test/cleantests.py
Normal file
@@ -0,0 +1,10 @@
# removes all files created during testing
import glob
import os

paths = []
for pattern in [ '*.actual', '*.actual-rewrite', '*.rewrite', '*.process-output' ]:
    paths += glob.glob( 'data/' + pattern )

for path in paths:
    os.unlink( path )
1
PowerEditor/src/jsoncpp/test/data/test_array_01.expected
Normal file
@@ -0,0 +1 @@
.=[]
1
PowerEditor/src/jsoncpp/test/data/test_array_01.json
Normal file
@@ -0,0 +1 @@
[]
2
PowerEditor/src/jsoncpp/test/data/test_array_02.expected
Normal file
@@ -0,0 +1,2 @@
.=[]
.[0]=1
1
PowerEditor/src/jsoncpp/test/data/test_array_02.json
Normal file
@@ -0,0 +1 @@
[1]
6
PowerEditor/src/jsoncpp/test/data/test_array_03.expected
Normal file
@@ -0,0 +1,6 @@
.=[]
.[0]=1
.[1]=2
.[2]=3
.[3]=4
.[4]=5
1
PowerEditor/src/jsoncpp/test/data/test_array_03.json
Normal file
@@ -0,0 +1 @@
[ 1, 2 , 3,4,5]
5
PowerEditor/src/jsoncpp/test/data/test_array_04.expected
Normal file
@@ -0,0 +1,5 @@
.=[]
.[0]=1
.[1]="abc"
.[2]=12.3
.[3]=-4
1
PowerEditor/src/jsoncpp/test/data/test_array_04.json
Normal file
@@ -0,0 +1 @@
[1, "abc" , 12.3, -4]
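Each *.expected file above pairs its *.json input with a flattened `path=value` dump: one line per node, `.` for the root, `.[i]` for array elements, and `.name` for object members. A small Python sketch of how such a dump can be produced (my reconstruction of the format from the samples, not jsoncpp's own writer):

```python
import json

def flatten(value, path="."):
    """Yield 'path=value' lines in the style of the *.expected files."""
    if isinstance(value, dict):
        yield path + "={}"
        for key in sorted(value):  # members appear in sorted order
            prefix = path + key if path == "." else path + "." + key
            for line in flatten(value[key], prefix):
                yield line
    elif isinstance(value, list):
        yield path + "=[]"
        for i, item in enumerate(value):
            for line in flatten(item, "%s[%d]" % (path, i)):
                yield line
    else:
        # json.dumps renders scalars the way the samples do:
        # quoted strings, bare numbers, true/false/null.
        yield "%s=%s" % (path, json.dumps(value))
```

For example, `list(flatten(json.loads('[1]')))` reproduces the two lines of test_array_02.expected.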
100
PowerEditor/src/jsoncpp/test/data/test_array_05.expected
Normal file
@@ -0,0 +1,100 @@
.=[]
.[0]=1
.[1]=2
.[2]=3
.[3]=4
.[4]=5
.[5]=6
.[6]=7
.[7]=8
.[8]=9
.[9]=10
.[10]=11
.[11]=12
.[12]=13
.[13]=14
.[14]=15
.[15]=16
.[16]=17
.[17]=18
.[18]=19
.[19]=20
.[20]=21
.[21]=22
.[22]=23
.[23]=24
.[24]=25
.[25]=26
.[26]=27
.[27]=28
.[28]=29
.[29]=30
.[30]=31
.[31]=32
.[32]=33
.[33]=34
.[34]=35
.[35]=36
.[36]=37
.[37]=38
.[38]=39
.[39]=40
.[40]=41
.[41]=42
.[42]=43
.[43]=44
.[44]=45
.[45]=46
.[46]=47
.[47]=48
.[48]=49
.[49]=50
.[50]=51
.[51]=52
.[52]=53
.[53]=54
.[54]=55
.[55]=56
.[56]=57
.[57]=58
.[58]=59
.[59]=60
.[60]=61
.[61]=62
.[62]=63
.[63]=64
.[64]=65
.[65]=66
.[66]=67
.[67]=68
.[68]=69
.[69]=70
.[70]=71
.[71]=72
.[72]=73
.[73]=74
.[74]=75
.[75]=76
.[76]=77
.[77]=78
.[78]=79
.[79]=80
.[80]=81
.[81]=82
.[82]=83
.[83]=84
.[84]=85
.[85]=86
.[86]=87
.[87]=88
.[88]=89
.[89]=90
.[90]=91
.[91]=92
.[92]=93
.[93]=94
.[94]=95
.[95]=96
.[96]=97
.[97]=98
.[98]=99
1
PowerEditor/src/jsoncpp/test/data/test_array_05.json
Normal file
@@ -0,0 +1 @@
[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99]
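The hundred-line fixture above is entirely mechanical; a short sketch of how such an input/expected pair could be regenerated (illustrative helper, not part of jsoncpp):

```python
# Rebuild the test_array_05 pair: integers 1..99 as a one-line JSON
# array, plus the flattened expected dump (root line + one per element).
values = list(range(1, 100))
json_text = "[" + ", ".join(str(v) for v in values) + "]"
expected = [".=[]"] + [".[%d]=%d" % (i, v) for i, v in enumerate(values)]
```

`expected` has 100 lines, matching the `@@ -0,0 +1,100 @@` hunk header.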
5
PowerEditor/src/jsoncpp/test/data/test_array_06.expected
Normal file
@@ -0,0 +1,5 @@
.=[]
.[0]="aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
.[1]="bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb"
.[2]="ccccccccccccccccccccccc"
.[3]="dddddddddddddddddddddddddddddddddddddddddddddddddddd"
4
PowerEditor/src/jsoncpp/test/data/test_array_06.json
Normal file
@@ -0,0 +1,4 @@
[ "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
  "bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb",
  "ccccccccccccccccccccccc",
  "dddddddddddddddddddddddddddddddddddddddddddddddddddd" ]
1
PowerEditor/src/jsoncpp/test/data/test_basic_01.expected
Normal file
@@ -0,0 +1 @@
.=123456789
1
PowerEditor/src/jsoncpp/test/data/test_basic_01.json
Normal file
@@ -0,0 +1 @@
0123456789
1
PowerEditor/src/jsoncpp/test/data/test_basic_02.expected
Normal file
@@ -0,0 +1 @@
.=-123456789
1
PowerEditor/src/jsoncpp/test/data/test_basic_02.json
Normal file
@@ -0,0 +1 @@
-0123456789
3
PowerEditor/src/jsoncpp/test/data/test_basic_03.expected
Normal file
@@ -0,0 +1,3 @@
.=1.2345678


3
PowerEditor/src/jsoncpp/test/data/test_basic_03.json
Normal file
@@ -0,0 +1,3 @@
1.2345678


2
PowerEditor/src/jsoncpp/test/data/test_basic_04.expected
Normal file
@@ -0,0 +1,2 @@
.="abcdef"

2
PowerEditor/src/jsoncpp/test/data/test_basic_04.json
Normal file
@@ -0,0 +1,2 @@
"abcdef"

2
PowerEditor/src/jsoncpp/test/data/test_basic_05.expected
Normal file
@@ -0,0 +1,2 @@
.=null

2
PowerEditor/src/jsoncpp/test/data/test_basic_05.json
Normal file
@@ -0,0 +1,2 @@
null

2
PowerEditor/src/jsoncpp/test/data/test_basic_06.expected
Normal file
@@ -0,0 +1,2 @@
.=true

2
PowerEditor/src/jsoncpp/test/data/test_basic_06.json
Normal file
@@ -0,0 +1,2 @@
true

2
PowerEditor/src/jsoncpp/test/data/test_basic_07.expected
Normal file
@@ -0,0 +1,2 @@
.=false

2
PowerEditor/src/jsoncpp/test/data/test_basic_07.json
Normal file
@@ -0,0 +1,2 @@
false

2
PowerEditor/src/jsoncpp/test/data/test_basic_08.expected
Normal file
@@ -0,0 +1,2 @@
.=null

3
PowerEditor/src/jsoncpp/test/data/test_basic_08.json
Normal file
@@ -0,0 +1,3 @@
// C++ style comment
null

2
PowerEditor/src/jsoncpp/test/data/test_basic_09.expected
Normal file
@@ -0,0 +1,2 @@
.=null

4
PowerEditor/src/jsoncpp/test/data/test_basic_09.json
Normal file
@@ -0,0 +1,4 @@
/* C style comment
*/
null

@@ -0,0 +1,8 @@
.={}
.test=[]
.test[0]={}
.test[0].a="aaa"
.test[1]={}
.test[1].b="bbb"
.test[2]={}
.test[2].c="ccc"
8
PowerEditor/src/jsoncpp/test/data/test_comment_01.json
Normal file
@@ -0,0 +1,8 @@
{
   "test":
   [
      { "a" : "aaa" }, // Comment for a
      { "b" : "bbb" }, // Comment for b
      { "c" : "ccc" } // Comment for c
   ]
}
20
PowerEditor/src/jsoncpp/test/data/test_complex_01.expected
Normal file
@@ -0,0 +1,20 @@
.={}
.attribute=[]
.attribute[0]="random"
.attribute[1]="short"
.attribute[2]="bold"
.attribute[3]=12
.attribute[4]={}
.attribute[4].height=7
.attribute[4].width=64
.count=1234
.name={}
.name.aka="T.E.S.T."
.name.id=123987
.test={}
.test.1={}
.test.1.2={}
.test.1.2.3={}
.test.1.2.3.coord=[]
.test.1.2.3.coord[0]=1
.test.1.2.3.coord[1]=2
17
PowerEditor/src/jsoncpp/test/data/test_complex_01.json
Normal file
@@ -0,0 +1,17 @@
{
   "count" : 1234,
   "name" : { "aka" : "T.E.S.T.", "id" : 123987 },
   "attribute" : [
      "random",
      "short",
      "bold",
      12,
      { "height" : 7, "width" : 64 }
   ],
   "test": { "1" :
      { "2" :
         { "3" : { "coord" : [ 1,2] }
         }
      }
   }
}
@@ -0,0 +1 @@
.=2147483647
2
PowerEditor/src/jsoncpp/test/data/test_integer_01.json
Normal file
@@ -0,0 +1,2 @@
// Max signed integer
2147483647
@@ -0,0 +1 @@
.=-2147483648
2
PowerEditor/src/jsoncpp/test/data/test_integer_02.json
Normal file
@@ -0,0 +1,2 @@
// Min signed integer
-2147483648
@@ -0,0 +1 @@
.=4294967295
2
PowerEditor/src/jsoncpp/test/data/test_integer_03.json
Normal file
@@ -0,0 +1,2 @@
// Max unsigned integer
4294967295
@@ -0,0 +1,2 @@
.=0

3
PowerEditor/src/jsoncpp/test/data/test_integer_04.json
Normal file
@@ -0,0 +1,3 @@
// Min unsigned integer
0

@@ -0,0 +1,2 @@
.=1

2
PowerEditor/src/jsoncpp/test/data/test_integer_05.json
Normal file
@@ -0,0 +1,2 @@
1

2122
PowerEditor/src/jsoncpp/test/data/test_large_01.expected
Normal file
File diff suppressed because it is too large
2
PowerEditor/src/jsoncpp/test/data/test_large_01.json
Normal file
File diff suppressed because one or more lines are too long
@@ -0,0 +1 @@
.={}
1
PowerEditor/src/jsoncpp/test/data/test_object_01.json
Normal file
@@ -0,0 +1 @@
{}
@@ -0,0 +1,2 @@
.={}
.count=1234
1
PowerEditor/src/jsoncpp/test/data/test_object_02.json
Normal file
@@ -0,0 +1 @@
{ "count" : 1234 }
@@ -0,0 +1,4 @@
.={}
.attribute="random"
.count=1234
.name="test"
5
PowerEditor/src/jsoncpp/test/data/test_object_03.json
Normal file
@@ -0,0 +1,5 @@
{
   "count" : 1234,
   "name" : "test",
   "attribute" : "random"
}
@@ -0,0 +1,2 @@
.={}
.=1234
3
PowerEditor/src/jsoncpp/test/data/test_object_04.json
Normal file
@@ -0,0 +1,3 @@
{
   "" : 1234
}
@@ -0,0 +1,3 @@
.={}
.first=1
.second=2
Some files were not shown because too many files have changed in this diff