Search for places, control tiles, routes, and items, and convert WGS 84 coordinates to a MapColonies Control Tile or a US Army MGRS tile.
MapColonies has its own Control Reference System. Like the US Army MGRS, we divide our users' area of interest into Tiles. Each tile is 10 km x 10 km and is divided into 1 km x 1 km Sub-Tiles, so each tile has 100 sub-tiles.
A tile's name is exactly 3 letters, while a sub-tile's name is a 2-digit number.
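As a hedged illustration (the row/column ordering of the 2-digit sub-tile number is an assumption for illustration only, not taken from the description above), the following sketch shows how an offset inside a 10 km x 10 km tile could map to one of its 100 sub-tiles:

```js
// Hypothetical sketch: map an offset (in meters) inside a 10km x 10km tile
// to a 1km x 1km sub-tile. The 2-digit numbering scheme used here is assumed.
const TILE_SIZE_M = 10_000;
const SUB_TILE_SIZE_M = 1_000;

function subTileNumber(eastOffsetM, northOffsetM) {
  if (
    eastOffsetM < 0 || eastOffsetM >= TILE_SIZE_M ||
    northOffsetM < 0 || northOffsetM >= TILE_SIZE_M
  ) {
    throw new Error('offset is outside the tile');
  }
  const col = Math.floor(eastOffsetM / SUB_TILE_SIZE_M);  // 0..9
  const row = Math.floor(northOffsetM / SUB_TILE_SIZE_M); // 0..9
  // 100 possible values, rendered as a 2-digit string, e.g. '00' or '95'
  return `${row}${col}`;
}

console.log(subTileNumber(7_300, 2_450)); // -> '27' under this assumed ordering
```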

The API exposes five major routes:
- `/search/query`
- `/search/location`
- `/search/control`
- `/lookup/coordinates`
- `/search/MGRS/tiles`
Each route serves a different purpose:
- `/search/query` enables users to search for anything. Based on a set of regular expressions, it "navigates"/proxies the request to the right route.
- `/search/location` enables users to search for locations via text-based search and filter them by source, region, geo context, and more. This route also lets users retrieve the available sources and regions to filter on, as these might change in the future.
- `/search/control` enables users to search the MapColonies Control Grid for Tiles, Items, and Routes.
- `/lookup/coordinates` enables users to convert WGS84 coordinates and choose whether the response is a MapColonies Control Tile or a US Army MGRS tile.
- `/search/MGRS/tiles` enables users to convert a US Army MGRS tile to a GeoJSON Feature.
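As a minimal sketch (the base URL, the `query` parameter name, and the exact response shape are assumptions, not documented here), a free-text search against `/search/query` might look like this:

```js
// Node 18+, run as an ES module (e.g. save as example.mjs).
// The host below is a placeholder; replace it with your Geocoding API endpoint.
const baseUrl = 'http://localhost:8080';

const params = new URLSearchParams({ query: 'new york', limit: '5' });
const response = await fetch(`${baseUrl}/search/query?${params}`);

// The response is assumed to be a GeoJSON-like feature collection.
console.log(await response.json());
```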
Almost all of our routes accept the same common query parameters: `geo_context`, `geo_context_mode`, `limit`, and `disable_fuzziness`.
| Query Parameter | Type | Default Value | Usage Explanation |
|---|---|---|---|
| `geo_context` | Bounding Box, WGS84 Circle, UTM Circle | `undefined` | Provides the search engine with the geo context of the search. |
| `geo_context_mode` | Enum (`filter`, `bias`) | `undefined` | Tells the search engine what to do with `geo_context`. You can filter the results (returning only features that match the query and intersect the `geo_context` shape) or bias them, so features that intersect the `geo_context` appear first. |
| `limit` | Number | `5` | By default, we return the top 5 features that match the query. You can set the limit anywhere from 1 to 15. If there are fewer matches, the response may contain fewer than `limit` features; the value only caps the maximum number returned. |
| `disable_fuzziness` | Boolean | `false` | Fuzziness is on by default. If you want an exact match, set `disable_fuzziness: true`. |
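As a hedged sketch of combining these parameters (the serialization of `geo_context` as a JSON string and the `query` parameter name used with `/search/location` are assumptions):

```js
// Node 18+, run as an ES module. Hypothetical example: filter location results
// to features that intersect a WGS84 bounding box, with fuzziness disabled.
const baseUrl = 'http://localhost:8080'; // placeholder endpoint

const params = new URLSearchParams({
  query: 'airport',                                                // assumed text parameter
  geo_context: JSON.stringify({ bbox: [34.7, 31.9, 34.9, 32.1] }), // assumed bbox serialization
  geo_context_mode: 'filter',
  limit: '10',
  disable_fuzziness: 'true',
});

const response = await fetch(`${baseUrl}/search/location?${params}`);
console.log(await response.json());
```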
Important
We also have a Feedback API. Each response is sent back with an `x-req-id` header, which is the identifier of the request. We kindly ask our users to send a request to the Feedback API containing their `x-api-key` and the response they clicked. This lets us analyze the request and response so the service can become more accurate. The Feedback API source code lives in a separate repository at <TODO: ADD LINK TO FEEDBACK API REPO>. The Geocoding API inserts the request and response into Redis before the response is sent.
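A hedged sketch of what such a feedback call could look like (the endpoint path, payload field names, and everything other than `x-api-key` and `x-req-id` are assumptions; see the Feedback API repository for the actual contract):

```js
// Node 18+, run as an ES module. Hypothetical feedback request: report which
// result the user clicked, referencing the x-req-id returned by the Geocoding API.
const feedbackUrl = 'http://localhost:8090/feedback'; // placeholder endpoint

await fetch(feedbackUrl, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'x-api-key': '<your-api-key>',                        // identifies the calling client
  },
  body: JSON.stringify({
    request_id: '<x-req-id from the search response>',    // ties feedback to the original request
    chosen_result_id: '<id of the clicked feature>',      // assumed field name
  }),
});
```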
Set up Elasticsearch and an S3 provider (for a local environment, MinIO is a personal recommendation).
Containerized Elasticsearch:
```bash
docker run -d --name elasticsearch -p 9200:9200 -e "discovery.type=single-node" -e "xpack.security.enabled=false" -e "xpack.security.enrollment.enabled=false" -e "ELASTIC_CONTAINER=true" elasticsearch:8.13.0
```
(Optional) Containerized Kibana:
```bash
docker run -d --name kibana -p 5601:5601 -e "ELASTIC_CONTAINER=true" kibana:8.13.0
```
Containerized MinIO:
```bash
docker run -d --name minio -p 9000:9000 -p 9001:9001 -e "MINIO_ROOT_USER=minio" -e "MINIO_ROOT_PASSWORD=minio123" quay.io/minio/minio server /data --console-address ":9001"
```
Note
As of this writing (September 18th, 2024), Elasticsearch's default username is `elastic` and the default password is `changeme`.
NLP Analyzer:
We use third-party software to extract the searched place-type name from the query string.
Here is a mock service that produces a response roughly like the one expected from the NLP Analyzer.
```js
const express = require('express');

const app = express();
app.set('port', 5000);
app.use(express.json({ limit: '50mb' }));
app.use(express.urlencoded({ limit: '50mb', extended: true }));

// Mock prediction endpoint: tags each token as either 'essence' or 'name'.
app.post('/NLP_ANALYSES', (req, res) => {
  // If none of the known keywords appear, every token is tagged 'essence'.
  if (
    !req.body.tokens?.find(
      (token) =>
        token.toUpperCase() === 'USA' ||
        token.toUpperCase() === 'NEW' ||
        token.toUpperCase() === 'LOS' ||
        token.toUpperCase() === 'PARIS' ||
        token.toUpperCase() === 'FRANCE'
    )
  ) {
    return res.status(200).json([
      {
        tokens: req.body.tokens,
        prediction: req.body.tokens.map((_) => 'essence'),
      },
    ]);
  }

  // Otherwise, the last token is tagged 'name' and the rest 'essence'.
  return res.status(200).json([
    {
      tokens: req.body.tokens,
      prediction: req.body.tokens.map((_, index) => {
        if (index + 1 === req.body.tokens.length) return 'name';
        else return 'essence';
      }),
    },
  ]);
});

app.listen(app.get('port'), () => {
  console.log(`Server is running on port ${app.get('port')}`);
});
```
Install mock data - don't forget to edit the `/config/test.json` and `/config/default.json` files to match your specific configuration.
```bash
npm run dev:scripts
```
Install deps with npm
```bash
npm install
npx husky install
```
Clone the project
```bash
git clone https://github.com/MapColonies/geocoding.git
```
Go to the project directory
```bash
cd geocoding
```
Install dependencies
```bash
npm install
```
Start the server
```bash
npm run start
```
To run tests, run the following command
```bash
npm run test
```
To only run unit tests:
```bash
npm run test:unit
```
To only run integration tests:
```bash
npm run test:integration
```