Merge branch 'master' into spirit-x3
commit 5ef5444fff
14 changed files with 199 additions and 111 deletions
CHANGELOG.md (43 changed lines)

@@ -6,6 +6,37 @@ Developers: Please commit along with changes.
 
 For a complete change history, see the git log.
 
+## 3.0.10
+
+Released: February 25, 2016
+
+(Packaged from 5c0d496)
+
+#### Summary
+
+- The `shapeindex` command now has a `--index-parts` option. When used, the index will be bigger,
+  but it allows the Shapefile datasource to parse only the polygon parts within the query bounds.
+- WARNING: index files generated with this newer Mapnik are invalid for older versions of Mapnik.
+- Any `.index` files accompanying a `.shp` must now be regenerated, otherwise they
+  will be skipped. To avoid this problem you can delete the existing `.index` files, or ideally run `shapeindex` to recreate the `.index`. (https://github.com/mapnik/mapnik/pull/3300)
+  The trigger for this change was an optimization that required a new binary format for the shapefile indexes (https://github.com/mapnik/mapnik/pull/3217).
+- Shapeindex - another fix for skipping `null` shapes (#3288)
+- Fixed support for filter expressions starting with `not` (https://github.com/mapnik/mapnik/issues/3017)
+- Ensure `mapped_memory_cache` acts as a singleton across shared objects (#3306)
+- Removed miniz support in the PNG encoder (#3281)
+- Added `-fvisibility=hidden -fvisibility-inlines-hidden` to default compiler flags
+- Fixed parsing of SVG `PathElement` (https://github.com/mapnik/mapnik/issues/3225)
+- JSON parsing now supports arbitrary (nested) attributes in `geometry`
+- Support for rendering `dash-array` in SVGs
+- SVG parser is now stricter (fails if not all input is parsable) (#3251)
+- SVG parser now correctly handles the optional separator `(,)` between multiple command parts
+- Optimized parsing of the `png` format string
+- The `memory_datasource` now dynamically reports the correct datasource type (vector or raster)
+- Upgraded `mapbox::variant` to v1.1.0
+- Compare: https://github.com/mapnik/mapnik/compare/v3.0.9...v3.0.10
+
 ## 3.0.9
 
 Released: November 26, 2015
@@ -18,12 +49,12 @@ Released: November 26, 2015
 - Fixed mapnik.util.variant issue when compiling with gcc-5.x and SSO enabled by default (https://github.com/mapnik/mapnik/issues/3103) (via @nkovacs)
 - Fixed issue with complex scripts where some character sequences weren't rendered correctly (https://github.com/mapnik/mapnik/issues/3050) (via @jkroll20)
 - Revived postgis.input tests
-- JSON: geometry grammar has been refactored and optimized to have expectation points
+- JSON: geometry grammar has been re-factored and optimized to have expectation points
 - Filled missing specializations for value_bool in `mapnik::value` comparison operators
 - `mapnik.Image` - fixed copy semantics implementation for internal buffer
 - JSON parsing: unified error_handler across all grammars
 - Improved unit test coverage
-- Raster scaling: fixed nodata handling, acurracy when working with small floats and clipping floats by \[0; 255\] (https://github.com/mapnik/mapnik/pull/3147)
+- Raster scaling: fixed nodata handling, accuracy when working with small floats and clipping floats by \[0; 255\] (https://github.com/mapnik/mapnik/pull/3147)
 - Added [`code of conduct`](http://contributor-covenant.org)
 - GeoJSON plug-in is updated to skip feature with empty geometries
 - GeoJSON plug-in : ensure original order of features is preserved (fixed) (https://github.com/mapnik/mapnik/issues/3182)

@@ -41,9 +72,9 @@ Released: October 23, 2015
 
 - Renamed `SHAPE_MEMORY_MAPPED_FILE` define to `MAPNIK_MEMORY_MAPPED_FILE`. Pass `./configure MEMORY_MAPPED_FILE=True|False` to request
   support for memory mapped files across Mapnik plugins (currently shape, csv, and geojson).
-- Unified `mapnik-index` utility supporing GeoJSON and CSV formats
+- Unified `mapnik-index` utility supporting GeoJSON and CSV formats
 - Increased unit test coverage for GeoJSON and CSV plugins
-- shape.input - refactor to support *.shx and improve handling various bogus shapefiles
+- shape.input - re-factor to support *.shx and improve handling various bogus shapefiles
 - geojson.input - make JSON parser stricter + support single Feature/Geometry as well as FeatureCollection
 - maintain 'FT_LOAD_NO_HINTING' + support >= harfbuzz 1.0.5
 - geojson.input - implement on-disk-index support

@@ -109,7 +140,7 @@ Released: August 26, 2015
 
 #### Summary
 
-- CSV.input: plug-in has been refactored to minimise memory usage and to improve handling of larger input.
+- CSV.input: plug-in has been re-factored to minimise memory usage and to improve handling of larger input.
   (NOTE: [large_csv](https://github.com/mapnik/mapnik/tree/large_csv) branch adds experimental trunsduction parser with deferred string initialisation)
 - CSV.input: added internal spatial index (boost::geometry::index::tree) for fast `bounding box` queries (https://github.com/mapnik/mapnik/pull/3010)
 - Fixed deadlock in recursive datasource registration via @zerebubuth (https://github.com/mapnik/mapnik/pull/3038)

@@ -1200,7 +1231,7 @@ Released April 1, 2009
 
 - Plugins: Use memory mapped files for reading shape file (r628)
 
-- Core: Use streams to write images (i/o refactor) (r628) (#15)
+- Core: Use streams to write images (i/o re-factor) (r628) (#15)
 
 # Mapnik 0.5.1
Makefile (3 changed lines)

@@ -23,7 +23,8 @@ release:
 	git clone --depth 1 --branch v$${MAPNIK_VERSION} git@github.com:mapnik/mapnik.git $${TARBALL_NAME} && \
 	cd $${TARBALL_NAME} && \
 	git checkout "tags/v$${MAPNIK_VERSION}" && \
-	git submodule update --depth 1 --init && \
+	git submodule update --depth 100 --init && \
 	rm -rf deps/mapbox/variant/.git && \
 	rm -rf test/data/.git && \
 	rm -rf test/data/.gitignore && \
 	rm -rf test/data-visual/.git && \
@@ -12,7 +12,13 @@ os: Visual Studio 2015
 # limit clone to latest 5 commits
 clone_depth: 5
 
+services:
+  - postgresql94 #if changing this, also change PATH below
 
 install:
+  - SET PGUSER=postgres
+  - SET PGPASSWORD=Password12!
+  - SET PATH=C:\Program Files\PostgreSQL\9.4\bin\;%PATH%
   - scripts\build-appveyor.bat
 
 artifacts:
@@ -10,6 +10,7 @@
 
 // stl
 #include <chrono>
+#include <cmath> // log10, round
 #include <cstdio> // snprintf
 #include <iostream>
 #include <set>

@@ -19,6 +20,12 @@
 
 namespace benchmark {
 
+template <typename T>
+using milliseconds = std::chrono::duration<T, std::milli>;
+
+template <typename T>
+using seconds = std::chrono::duration<T>;
+
 class test_case
 {
 protected:
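The two aliases added above give the benchmark harness floating-point durations, so an elapsed clock delta can be read as fractional milliseconds or seconds without an explicit `duration_cast`. A minimal standalone sketch of that usage (the `main` scaffolding and the 5 ms sleep are illustrative, not part of the harness):

```cpp
// Illustrative only: fractional-millisecond timing with the aliases above.
#include <chrono>
#include <iostream>
#include <thread>

template <typename T>
using milliseconds = std::chrono::duration<T, std::milli>;

template <typename T>
using seconds = std::chrono::duration<T>;

int main()
{
    auto start = std::chrono::high_resolution_clock::now();
    std::this_thread::sleep_for(std::chrono::milliseconds(5));
    auto elapsed = std::chrono::high_resolution_clock::now() - start;

    // duration<double, std::milli> converts implicitly from the clock's
    // integral duration, so no duration_cast is needed here.
    std::cout << milliseconds<double>(elapsed).count() << " ms, "
              << seconds<double>(elapsed).count() << " s\n";
    return 0;
}
```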
@@ -92,7 +99,7 @@ inline int parse_args(int argc, char** argv, mapnik::parameters & params)
 
 inline void handle_common_args(mapnik::parameters const& params)
 {
-    if (auto severity = params.get<std::string>("log-severity")) {
+    if (auto severity = params.get<std::string>("log")) {
         if (*severity == "debug")
             mapnik::logger::set_severity(mapnik::logger::debug);
         else if (*severity == "warn")

@@ -102,7 +109,7 @@ inline void handle_common_args(mapnik::parameters const& params)
         else if (*severity == "none")
             mapnik::logger::set_severity(mapnik::logger::none);
         else
-            std::clog << "ignoring option --log-severity='" << *severity
+            std::clog << "ignoring option --log='" << *severity
                       << "' (allowed values are: debug, warn, error, none)\n";
     }
 }
@@ -134,6 +141,29 @@ inline int handle_args(int argc, char** argv, mapnik::parameters & params)
     } \
 } \
 
+struct big_number_fmt
+{
+    int w;
+    double v;
+    const char* u;
+
+    big_number_fmt(int width, double value, int base = 1000)
+        : w(width), v(value), u("")
+    {
+        static const char* suffixes = "\0\0k\0M\0G\0T\0P\0E\0Z\0Y\0\0";
+        u = suffixes;
+
+        while (v > 1 && std::log10(std::round(v)) >= width && u[2])
+        {
+            v /= base;
+            u += 2;
+        }
+
+        // adjust width for proper alignment without suffix
+        w += (u == suffixes);
+    }
+};
+
 template <typename T>
 int run(T const& test_runner, std::string const& name)
 {
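`big_number_fmt` repeatedly divides a large value by 1000 and walks a suffix table (`k`, `M`, `G`, …) until the value fits the requested width; the resulting width/value/suffix triple is meant to feed a printf-style `%*.0f%s` field, as the reworked `std::snprintf` call in a later hunk does. A standalone sketch of that pairing — the `main` function and the sample numbers are illustrative only:

```cpp
// Illustrative pairing of big_number_fmt with a printf-style "%*.0f%s" field.
#include <cmath>
#include <cstdio>

struct big_number_fmt
{
    int w;
    double v;
    const char* u;

    big_number_fmt(int width, double value, int base = 1000)
        : w(width), v(value), u("")
    {
        static const char* suffixes = "\0\0k\0M\0G\0T\0P\0E\0Z\0Y\0\0";
        u = suffixes;
        while (v > 1 && std::log10(std::round(v)) >= width && u[2])
        {
            v /= base;
            u += 2;
        }
        // adjust width for proper alignment without suffix
        w += (u == suffixes);
    }
};

int main()
{
    big_number_fmt itersf(4, 10000000.0);   // scales to "  10M"
    big_number_fmt ips(5, 3200.0);          // no suffix; width bumps to 6, prints "  3200"
    std::printf("%*.0f%s iters %*.0f%s i/s\n",
                itersf.w, itersf.v, itersf.u,
                ips.w, ips.v, ips.u);
    return 0;
}
```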
@@ -156,21 +186,43 @@ int run(T const& test_runner, std::string const& name)
         auto opt_min_duration = test_runner.params().template get<double>("min-duration", 0.0);
         std::chrono::duration<double> min_seconds(*opt_min_duration);
         auto min_duration = std::chrono::duration_cast<decltype(elapsed)>(min_seconds);
         std::size_t loops = 0;
+        auto num_iters = test_runner.iterations();
+        auto num_threads = test_runner.threads();
+        auto total_iters = 0;
 
-        if (test_runner.threads() > 0)
+        if (num_threads > 0)
         {
-            using thread_group = std::vector<std::unique_ptr<std::thread> >;
-            using value_type = thread_group::value_type;
-            thread_group tg;
-            for (std::size_t i=0;i<test_runner.threads();++i)
+            std::mutex mtx_ready;
+            std::unique_lock<std::mutex> lock_ready(mtx_ready);
+
+            auto stub = [&](T const& test_copy)
             {
-                tg.emplace_back(new std::thread(test_runner));
+                // workers will wait on this mutex until the main thread
+                // constructs all of them and starts measuring time
+                std::unique_lock<std::mutex> my_lock(mtx_ready);
+                my_lock.unlock();
+                test_copy();
+            };
+
+            std::vector<std::thread> tg;
+            tg.reserve(num_threads);
+            for (auto i = num_threads; i-- > 0; )
+            {
+                tg.emplace_back(stub, test_runner);
             }
             start = std::chrono::high_resolution_clock::now();
-            std::for_each(tg.begin(), tg.end(), [](value_type & t) {if (t->joinable()) t->join();});
+            lock_ready.unlock();
+            // wait for all workers to finish
+            for (auto & t : tg)
+            {
+                if (t.joinable())
+                    t.join();
+            }
             elapsed = std::chrono::high_resolution_clock::now() - start;
             loops = 1;
+            // this is actually per-thread count, not total, but I think
+            // reporting average 'iters/thread/second' is more useful
+            // than 'iters/second' multiplied by the number of threads
+            total_iters += num_iters;
         }
         else
        {
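The mutex here works as a start barrier: the main thread keeps `mtx_ready` locked while it constructs every worker, each worker blocks on that same mutex before running its payload, and timing starts only once the lock is released. A condensed standalone sketch of the same pattern (the thread count, empty payload and output are illustrative, not taken from the harness):

```cpp
// Illustrative start-barrier: workers block on a mutex held by the main
// thread until all of them have been constructed.
#include <chrono>
#include <iostream>
#include <mutex>
#include <thread>
#include <vector>

int main()
{
    std::mutex mtx_ready;
    std::unique_lock<std::mutex> lock_ready(mtx_ready);   // held while spawning

    auto worker = [&mtx_ready]()
    {
        std::unique_lock<std::mutex> my_lock(mtx_ready);  // blocks until release
        my_lock.unlock();
        // ... run the timed workload here ...
    };

    std::vector<std::thread> tg;
    tg.reserve(4);
    for (int i = 0; i < 4; ++i) tg.emplace_back(worker);

    auto start = std::chrono::high_resolution_clock::now();
    lock_ready.unlock();                                  // release all workers
    for (auto & t : tg) t.join();
    auto elapsed = std::chrono::high_resolution_clock::now() - start;

    std::cout << std::chrono::duration<double, std::milli>(elapsed).count()
              << " ms\n";
    return 0;
}
```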
@@ -178,39 +230,25 @@ int run(T const& test_runner, std::string const& name)
             do {
                 test_runner();
                 elapsed = std::chrono::high_resolution_clock::now() - start;
                 ++loops;
+                total_iters += num_iters;
             } while (elapsed < min_duration);
         }
 
-        double iters = loops * test_runner.iterations();
-        double dur_total = std::chrono::duration<double, std::milli>(elapsed).count();
-        double dur_avg = dur_total / iters;
-        char iters_unit = ' ';
         char msg[200];
 
-        if (iters >= 1e7) iters *= 1e-6, iters_unit = 'M';
-        else if (iters >= 1e4) iters *= 1e-3, iters_unit = 'k';
+        double dur_total = milliseconds<double>(elapsed).count();
+        auto elapsed_nonzero = std::max(elapsed, decltype(elapsed){1});
+        big_number_fmt itersf(4, total_iters);
+        big_number_fmt ips(5, total_iters / seconds<double>(elapsed_nonzero).count());
 
         std::snprintf(msg, sizeof(msg),
-                      "%-43s %3zu threads %4.0f%c iters %6.0f milliseconds",
+                      "%-43s %3zu threads %*.0f%s iters %6.0f milliseconds %*.0f%s i/s\n",
                       name.c_str(),
-                      test_runner.threads(),
-                      iters, iters_unit,
-                      dur_total);
+                      num_threads,
+                      itersf.w, itersf.v, itersf.u,
+                      dur_total,
+                      ips.w, ips.v, ips.u
+                      );
         std::clog << msg;
 
-        // log average time per iteration, currently only for non-threaded runs
-        if (test_runner.threads() == 0)
-        {
-            char unit = 'm';
-            if (dur_avg < 1e-5) dur_avg *= 1e+9, unit = 'p';
-            else if (dur_avg < 1e-2) dur_avg *= 1e+6, unit = 'n';
-            else if (dur_avg < 1e+1) dur_avg *= 1e+3, unit = 'u';
-            std::snprintf(msg, sizeof(msg), " %4.0f%cs/iter", dur_avg, unit);
-            std::clog << msg;
-        }
-
-        std::clog << "\n";
         return 0;
     }
     catch (std::exception const& ex)
@@ -6,8 +6,14 @@ source ./localize.sh
 
 BASE=./benchmark/out
 function run {
-    ${BASE}/$1 --threads 0 --iterations $3;
-    ${BASE}/$1 --threads $2 --iterations $(expr $3 / $2);
+    local runner="$BASE/$1 --log=none"
+    local threads="$2"
+    local iters="$3"
+    shift 3
+    $runner --threads 0 --iterations $iters "$@"
+    if test $threads -gt 0; then
+        $runner --threads $threads --iterations $((iters/threads)) "$@"
+    fi
 }
 run test_getline 30 10000000
 #run test_array_allocation 20 100000
@@ -90,7 +90,7 @@ struct csv_line_grammar : qi::grammar<Iterator, csv_line(char, char), Skipper>
                 ("\\\"", '\"')
                 ("\"\"", '\"') // double quote
                 ;
-            line = -lit("\n\r") >> column(_r1, _r2) % lit(_r1)
+            line = -lit("\r") >> -lit("\n") >> column(_r1, _r2) % lit(_r1)
                 ;
             column = quoted(_r2) | *(char_ - lit(_r1))
                 ;
@@ -25,7 +25,7 @@
 
 #define MAPNIK_MAJOR_VERSION 3
 #define MAPNIK_MINOR_VERSION 0
-#define MAPNIK_PATCH_VERSION 9
+#define MAPNIK_PATCH_VERSION 10
 
 #define MAPNIK_VERSION (MAPNIK_MAJOR_VERSION*100000) + (MAPNIK_MINOR_VERSION*100) + (MAPNIK_PATCH_VERSION)
 
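The patch bump above changes what the `MAPNIK_VERSION` macro evaluates to: major, minor and patch are packed into one integer, so 3.0.10 becomes 3*100000 + 0*100 + 10 = 300010. A standalone illustration of that arithmetic (the `static_assert`/`main` scaffolding is illustrative, not part of the header):

```cpp
// Illustrative only: what the version macros evaluate to after the bump.
#define MAPNIK_MAJOR_VERSION 3
#define MAPNIK_MINOR_VERSION 0
#define MAPNIK_PATCH_VERSION 10
#define MAPNIK_VERSION (MAPNIK_MAJOR_VERSION*100000) + (MAPNIK_MINOR_VERSION*100) + (MAPNIK_PATCH_VERSION)

// 3*100000 + 0*100 + 10 == 300010
static_assert(MAPNIK_VERSION == 300010, "3.0.10 packs to 300010");

int main() { return 0; }
```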
@@ -200,6 +200,20 @@ const mapnik::json::feature_grammar<base_iterator_type, mapnik::feature_impl> ge
 const mapnik::json::extract_bounding_box_grammar<base_iterator_type> geojson_datasource_static_bbox_grammar;
 }
 
+void geojson_datasource::initialise_descriptor(mapnik::feature_ptr const& feature)
+{
+    for ( auto const& kv : *feature)
+    {
+        auto const& name = std::get<0>(kv);
+        if (!desc_.has_name(name))
+        {
+            desc_.add_descriptor(mapnik::attribute_descriptor(name,
+                mapnik::util::apply_visitor(attr_value_converter(),
+                                            std::get<1>(kv))));
+        }
+    }
+}
+
 void geojson_datasource::initialise_disk_index(std::string const& filename)
 {
     // read extent
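Together with the `num_features_to_query_` limit used elsewhere in this diff, the new helper means the layer's attribute descriptor is built from the union of keys seen in the first few features instead of only the first one. A simplified standalone sketch of that idea, with `std::map`/`std::set` standing in for mapnik's feature and descriptor types (all names and sample data here are illustrative):

```cpp
// Illustrative only: build an attribute "descriptor" from the first N features,
// keeping each attribute name once, the way initialise_descriptor() is used.
#include <iostream>
#include <map>
#include <set>
#include <string>
#include <vector>

using feature = std::map<std::string, std::string>;  // name -> value stand-in

int main()
{
    std::vector<feature> features = {
        {{"one", "a"}},                 // first feature only has "one"
        {{"one", "b"}, {"two", "c"}},   // second adds "two"
        {{"three", "d"}},
    };

    const std::size_t num_features_to_query = 5;  // mirrors num_features_to_query_
    std::set<std::string> descriptor;             // stands in for layer_descriptor

    std::size_t index = 0;
    for (auto const& f : features)
    {
        if (index++ >= num_features_to_query) break;
        for (auto const& kv : f)
        {
            descriptor.insert(kv.first);          // add_descriptor() analogue
        }
    }

    for (auto const& name : descriptor) std::cout << name << "\n";
    // prints: one, three, two (schema no longer limited to the first feature)
    return 0;
}
```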
@@ -213,7 +227,7 @@ void geojson_datasource::initialise_disk_index(std::string const& filename)
     std::vector<value_type> positions;
     mapnik::util::spatial_index<value_type,
                                 mapnik::filter_in_box,
-                                std::ifstream>::query_first_n(filter, index, positions, 5);
+                                std::ifstream>::query_first_n(filter, index, positions, num_features_to_query_);
 
     mapnik::util::file file(filename_);
     if (!file.open()) throw mapnik::datasource_exception("GeoJSON Plugin: could not open: '" + filename_ + "'");

@@ -236,16 +250,7 @@ void geojson_datasource::initialise_disk_index(std::string const& filename)
         {
             throw std::runtime_error("Failed to parse geojson feature");
         }
-        for ( auto const& kv : *feature)
-        {
-            auto const& name = std::get<0>(kv);
-            if (!desc_.has_name(name))
-            {
-                desc_.add_descriptor(mapnik::attribute_descriptor(name,
-                    mapnik::util::apply_visitor(attr_value_converter(),
-                                                std::get<1>(kv))));
-            }
-        }
+        initialise_descriptor(feature);
     }
 }

@@ -285,20 +290,18 @@ void geojson_datasource::initialise_index(Iterator start, Iterator end)
             if (geometry_index == 0)
             {
                 extent_ = box;
-                for ( auto const& kv : *f)
-                {
-                    desc_.add_descriptor(mapnik::attribute_descriptor(std::get<0>(kv),
-                        mapnik::util::apply_visitor(attr_value_converter(),
-                                                    std::get<1>(kv))));
-                }
             }
             else
             {
                 extent_.expand_to_include(box);
             }
             values.emplace_back(box, std::make_pair(geometry_index,0));
 
         }
-        ++geometry_index;
+        if (geometry_index++ < num_features_to_query_)
+        {
+            initialise_descriptor(f);
+        }
     }
     // packing algorithm
     tree_ = std::make_unique<spatial_index_type>(values);
@@ -308,14 +311,16 @@ void geojson_datasource::initialise_index(Iterator start, Iterator end)
         // bulk insert initialise r-tree
         tree_ = std::make_unique<spatial_index_type>(boxes);
         // calculate total extent
+        std::size_t feature_count = 0;
         for (auto const& item : boxes)
         {
             auto const& box = std::get<0>(item);
             auto const& geometry_index = std::get<1>(item);
-            if (!extent_.valid())
+            if (!extent_.valid()) extent_ = box;
+            else extent_.expand_to_include(box);
+            if (feature_count++ < num_features_to_query_)
             {
-                extent_ = box;
-                // parse first feature to extract attributes schema.
+                // parse first N features to extract attributes schema.
                 // NOTE: this doesn't yield correct answer for geoJSON in general, just an indication
                 Iterator itr2 = start + geometry_index.first;
                 Iterator end2 = itr2 + geometry_index.second;
@@ -327,16 +332,8 @@
                 {
                     throw std::runtime_error("Failed to parse geojson feature");
                 }
-                for ( auto const& kv : *feature)
-                {
-                    desc_.add_descriptor(mapnik::attribute_descriptor(std::get<0>(kv),
-                        mapnik::util::apply_visitor(attr_value_converter(),
-                                                    std::get<1>(kv))));
-                }
-            }
-            else
-            {
-                extent_.expand_to_include(box);
+                initialise_descriptor(feature);
             }
         }
     }
@@ -399,12 +396,7 @@ void geojson_datasource::parse_geojson(Iterator start, Iterator end)
             if (geometry_index == 0)
             {
                 extent_ = box;
-                for ( auto const& kv : *f)
-                {
-                    desc_.add_descriptor(mapnik::attribute_descriptor(std::get<0>(kv),
-                        mapnik::util::apply_visitor(attr_value_converter(),
-                                                    std::get<1>(kv))));
-                }
 
             }
             else
             {

@@ -412,6 +404,10 @@ void geojson_datasource::parse_geojson(Iterator start, Iterator end)
             }
             values.emplace_back(box, std::make_pair(geometry_index,0));
         }
+        if (geometry_index < num_features_to_query_)
+        {
+            initialise_descriptor(f);
+        }
         ++geometry_index;
     }
     // packing algorithm
@@ -419,7 +415,7 @@
 
 }
 
-geojson_datasource::~geojson_datasource() { }
+geojson_datasource::~geojson_datasource() {}
 
 const char * geojson_datasource::name()
 {
@@ -454,7 +450,7 @@ boost::optional<mapnik::datasource_geometry_t> geojson_datasource::get_geometry_
     std::vector<value_type> positions;
     mapnik::util::spatial_index<value_type,
                                 mapnik::filter_in_box,
-                                std::ifstream>::query_first_n(filter, index, positions, 5);
+                                std::ifstream>::query_first_n(filter, index, positions, num_features_to_query_);
 
     mapnik::util::file file(filename_);
 

@@ -494,7 +490,7 @@ boost::optional<mapnik::datasource_geometry_t> geojson_datasource::get_geometry_
     else if (cache_features_)
     {
         unsigned num_features = features_.size();
-        for (unsigned i = 0; i < num_features && i < 5; ++i)
+        for (unsigned i = 0; i < num_features && i < num_features_to_query_; ++i)
         {
             result = mapnik::util::to_ds_type(features_[i]->get_geometry());
             if (result)

@@ -519,7 +515,7 @@ boost::optional<mapnik::datasource_geometry_t> geojson_datasource::get_geometry_
     auto itr = tree_->qbegin(boost::geometry::index::intersects(extent_));
     auto end = tree_->qend();
     mapnik::context_ptr ctx = std::make_shared<mapnik::context_type>();
-    for (std::size_t count = 0; itr !=end && count < 5; ++itr,++count)
+    for (std::size_t count = 0; itr !=end && count < num_features_to_query_; ++itr,++count)
     {
         geojson_datasource::item_type const& item = *itr;
         std::size_t file_offset = item.second.first;
@@ -94,6 +94,7 @@ public:
     void initialise_index(Iterator start, Iterator end);
     void initialise_disk_index(std::string const& filename);
 private:
+    void initialise_descriptor(mapnik::feature_ptr const&);
     mapnik::datasource::datasource_t type_;
     mapnik::layer_descriptor desc_;
     std::string filename_;

@@ -103,6 +104,7 @@ private:
     std::unique_ptr<spatial_index_type> tree_;
     bool cache_features_ = true;
     bool has_disk_index_ = false;
+    const std::size_t num_features_to_query_ = 5;
 };
@@ -22,6 +22,8 @@ ECHO msvs_toolset^: %msvs_toolset%
 SET BUILD_TYPE=%configuration%
 SET BUILDPLATFORM=%platform%
 SET TOOLS_VERSION=%msvs_toolset%.0
+SET ICU_VERSION=56.1
+ECHO ICU_VERSION^: %ICU_VERSION%
 IF DEFINED APPVEYOR (ECHO on AppVeyor) ELSE (ECHO NOT on AppVeyor)
 ECHO ========

@@ -32,6 +34,16 @@ SET PATH=C:\Program Files\7-Zip;%PATH%
 git submodule update --init deps/mapbox/variant
 IF %ERRORLEVEL% NEQ 0 GOTO ERROR
 
+
+::python bindings, including test data
+IF NOT EXIST bindings\python git clone --recursive https://github.com/mapnik/python-mapnik.git bindings/python
+IF %ERRORLEVEL% NEQ 0 GOTO ERROR
+
+CD bindings\python & IF %ERRORLEVEL% NEQ 0 GOTO ERROR
+git fetch & IF %ERRORLEVEL% NEQ 0 GOTO ERROR
+git pull & IF %ERRORLEVEL% NEQ 0 GOTO ERROR
+CD ..\.. & IF %ERRORLEVEL% NEQ 0 GOTO ERROR
+
 ::cloning mapnik-gyp
 if EXIST mapnik-gyp ECHO mapnik-gyp already cloned && GOTO MAPNIK_GYP_ALREADY_HERE
 CALL git clone https://github.com/mapnik/mapnik-gyp.git
@@ -33,14 +33,6 @@ SET msvs_toolset=14
 SET platform=x64
 SET APPVEYOR_BUILD_FOLDER=%CD%
 
-IF NOT EXIST bindings\python git clone https://github.com/mapnik/python-mapnik.git bindings/python
-IF %ERRORLEVEL% NEQ 0 GOTO ERROR
-
-CD bindings\python & IF %ERRORLEVEL% NEQ 0 GOTO ERROR
-git fetch & IF %ERRORLEVEL% NEQ 0 GOTO ERROR
-git pull & IF %ERRORLEVEL% NEQ 0 GOTO ERROR
-CD ..\.. & IF %ERRORLEVEL% NEQ 0 GOTO ERROR
-
 ECHO pulling test data
 CALL git submodule update --init
 IF %ERRORLEVEL% NEQ 0 GOTO ERROR
@@ -158,6 +158,16 @@ TEST_CASE("expressions")
     TRY_CHECK(eval(" [int] > 100 and [int] gt 100.0 and [double] < 2 and [double] lt 2.0 ") == true);
     TRY_CHECK(eval(" [int] >= 123 and [int] ge 123.0 and [double] <= 1.23456 and [double] le 1.23456 ") == true);
 
+    // empty string/null equality
+    TRY_CHECK(eval("[null] = null") == true);
+    TRY_CHECK(eval("[null] != null") == false);
+    TRY_CHECK(eval("[null] = ''") == false);
+    ///////////////////// ref: https://github.com/mapnik/mapnik/issues/1859
+    TRY_CHECK(eval("[null] != ''") == false); // back compatible - will be changed in 3.1.x
+    //////////////////////
+    TRY_CHECK(eval("'' = [null]") == false);
+    TRY_CHECK(eval("'' != [null]") == true);
+
     // regex
     // replace
     TRY_CHECK(eval(" [foo].replace('(\\B)|( )','$1 ') ") == tr.transcode("b a r"));
@@ -663,20 +663,8 @@ TEST_CASE("geojson") {
                 auto ds = mapnik::datasource_cache::instance().create(params);
                 REQUIRE(bool(ds));
                 auto fields = ds->get_descriptor().get_descriptors();
-                // TODO: this combo (cache_features==false and create_index==false)
-                // exposes the case where not all field names are reported, which should
-                // ideally be fixed, but how?
-                if (cache_features == false && create_index == false)
-                {
-                    std::initializer_list<std::string> names = {"one"};
-                    REQUIRE_FIELD_NAMES(fields, names);
-                }
-                else
-                {
-                    std::initializer_list<std::string> names = {"one", "two"};
-                    REQUIRE_FIELD_NAMES(fields, names);
-                }
-
+                std::initializer_list<std::string> names = {"one", "two"};
+                REQUIRE_FIELD_NAMES(fields, names);
             }
             // cleanup
             if (create_index && mapnik::util::exists(filename + ".index"))
@@ -35,6 +35,7 @@
 #endif
 
 #include <fstream>
+#include <iomanip>
 
 namespace mapnik { namespace detail {

@@ -89,7 +90,12 @@ std::pair<bool,box2d<double>> process_csv_file(T & boxes, std::string const& fil
 
     ::detail::geometry_column_locator locator;
     std::vector<std::string> headers;
-    std::clog << "Parsing CSV using SEPARATOR=" << separator << " QUOTE=" << quote << std::endl;
+    std::clog << std::showbase << std::internal << std::setfill('0') ;
+    std::clog << "Parsing CSV using"
+              << " NEWLINE=" << std::hex << std::setw(4) << int(newline) << std::dec
+              << " SEPARATOR=" << separator
+              << " QUOTE=" << quote << std::endl;
 
     if (!manual_headers.empty())
     {
         std::size_t index = 0;
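The reworked log line above prints the detected newline character's code as a zero-padded hexadecimal value. A tiny standalone illustration of what that manipulator chain produces (the sample character and `main` wrapper are illustrative):

```cpp
// Illustrative only: the manipulator chain used for the NEWLINE field.
#include <iomanip>
#include <iostream>

int main()
{
    char newline = '\n';
    std::clog << std::showbase << std::internal << std::setfill('0');
    std::clog << "NEWLINE=" << std::hex << std::setw(4) << int(newline) << std::dec << "\n";
    // prints: NEWLINE=0x0a  (showbase adds "0x"; internal zero-fill pads to width 4)
    return 0;
}
```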