Hacks & maps @ #Hack4NO

Thanks to a Wikistipend from Wikimedia Norge, I had the chance to join the Nordic Open Geo Data Gathering and #Hack4NO in Hønefoss, Norway.

Nordic Open Geo Data Gathering

The meeting at Kartverket on 27 October was attended by me; Susanna Ånäs with her son and Esa Tiainen from Finland; Albin Larsson from Sweden; Vahur Puik from Estonia; Astrid Carlsen and Jon Harald Søby from Wikimedia Norge; and many other people.

Nordic Open Geo Data Gathering participants at Norwegian mapping authority Kartverket's headquarters in Hønefoss, Norway


During the meeting we listened to many interesting presentations, as described in the blog post by Susanna Ånäs. She also suggested founding a Wikimedia affiliate, i.e. a user group focused on mapping-related activities in Wikimedia. There was also a discussion of people's interests, which Susanna collected in the meeting notes.

#hack4no

Some other Nordic Open Geo Data Gathering participants and I stayed a bit longer in Hønefoss to join #hack4no. Many applications were built during the hackathon, and all the submissions can be found at hack4no2016.devpost.com.

Jon Harald Søby from Wikimedia Norge and I submitted the WikiOSMark application. Thanks again to Wikimedia Norge for the Wikistipend that made this possible.

Map of the WikiOSMark app

WikiOSMark app with Wikimedia items on the map.

The main idea of the application is to enrich Wikimedia with OpenStreetMap data and vice versa. For example, the aim is to allow users to add Wikidata tags to OpenStreetMap elements. Adding coordinates to Wikidata items is also useful. The prototype application uses the Wikimedia and OpenStreetMap test sites. Users can also choose to view other open data from various Nordic sources, such as platsr.se, apis.is and Flickr, on the map. This information is also useful when making OpenStreetMap and Wikimedia edits. Finally, the application can show Wheelmap.org places. Wheelmap.org helps add information about wheelchair accessibility to OpenStreetMap. While editing OpenStreetMap and Wikimedia, why not also consider disabled people?
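The kind of lookup such an app needs can be sketched as follows. This is not the WikiOSMark code, just a minimal illustration of building an Overpass API query for named OSM elements near a point that still lack a `wikidata` tag; the radius and query shape are my own assumptions:

```python
# Build an Overpass QL query for OSM nodes/ways near a point that do
# not yet have a "wikidata" tag. Illustrative sketch only -- not the
# actual WikiOSMark implementation.

def missing_wikidata_query(lat, lon, radius_m=500):
    """Return an Overpass QL string for named elements without a wikidata tag."""
    around = f"(around:{radius_m},{lat},{lon})"
    return (
        "[out:json][timeout:25];\n"
        "(\n"
        f'  node["name"][!"wikidata"]{around};\n'
        f'  way["name"][!"wikidata"]{around};\n'
        ");\n"
        "out center tags;"
    )

query = missing_wikidata_query(60.168, 24.941)  # central Helsinki
print(query)
```

The resulting string can be POSTed to any Overpass API endpoint; each returned element is a candidate for getting a Wikidata tag.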

To mention some other work, Vahur Puik from Estonia submitted an “AI tool to filter out indoor and outdoor images from public digital photographic collections” that is meant to be used, for example, with the Sift.pics application. There were also many other map-related submissions.

All in all, interesting events!

Maps & hacks @ #Hack4NO

Around 30 participants gathered in the Nordic Open Geodata Meetup at the Norwegian mapping authority Kartverket’s headquarters in Hønefoss, Norway. The participants convened from around the Nordics: Norway, Sweden, Finland and Estonia.

One reason for the gathering was to chart ways of networking about open geodata in the Nordic countries. Another aspect was to discuss ideas for the #Hack4NO hackathon that followed the next day.

New networks were not set up during these days, but the ideas are maturing. In Finland, mappers are finding ways to get organized, and the Wikimaps project is setting up a Wikimaps user group.

This is intended to be an evolving blog post as people are sending in their documentation and thoughts. Check back later as well!

Project presentations

Humanitarian mapping

Erno Mäkinen presented Humanitarian OpenStreetMap Team tools and activities, focusing especially on the recent growth of humanitarian mapping events in Finland. Mapping events are great at bringing people together, in particular when the action has direct impact. The tools for humanitarian projects are the same as for other kinds of mapping.

Interactive maps on Wikipedia

Albin Larsson presented the new mapping possibilities in MediaWiki projects with the Kartographer extension’s <maplink> and <mapframe> tags, and using SPARQL queries in Wikidata and geographic shapes from OSM to create maps.
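As a rough illustration of what this looks like in wiki markup (a generic sketch, not taken from Albin's presentation), a `<mapframe>` tag can embed a map with a GeoJSON marker directly in an article:

```
<mapframe latitude="60.17" longitude="24.94" zoom="13" width="400" height="300">
{
  "type": "Feature",
  "geometry": { "type": "Point", "coordinates": [24.94, 60.17] },
  "properties": { "title": "Helsinki", "marker-symbol": "marker" }
}
</mapframe>
```

Note that GeoJSON coordinates are ordered longitude first, latitude second, the opposite of the tag attributes.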

Crowdsourced geotagging of images

Vahur Puik presented the gamified environment Ajapaik for crowdsourcing locations for old photographs, and the forthcoming Norwegian instance Fotodugnad.

Combining aspects of humanitarian mapping and working with cultural heritage

Susanna Ånäs proposed new configurations for cultural projects, combining humanitarian work with existing GLAM-Wiki activities in Wikimedia.

Georef – place names linking data

Esa Tiainen presented Georef – a new Finnish initiative to organise Finnish place names – both current and historical – as linked data.

Susanna Ånäs also briefly showed WikiProject Historical Place, which is a project to model historical places in Wikidata. We need authorities, researchers and volunteers working together to bring knowledge about historical places into Wikidata!

Mining Heritage Site

Bjørn Ivar Berg presented the case of the Norwegian Mining Museum’s rich archival content.

K-Lab map demonstrators

Vemund Olstad presented the K-Lab map demonstrators and how they have been used in the project En Blå Tråd.

Lokalhistoriewiki.no – Local history wiki

Marianne Wiig presented Lokalhistoriewiki, a wiki that includes rich information about people and places in Norway.

Mapillary – crowdsourced street view

Peter Neubauer presented Mapillary, the crowdsourced street view and their new plans to offer the users of Panoramio the opportunity to migrate their images to Mapillary. Mapillary connects with OpenStreetMap, and there have been ideas to connect historical images through Mapillary in the same way.

Maptime communities

Lasse Sætre briefly described his work with mapping community Maptime Oslo. Maptime chapters exist in many Nordic cities, and they connect activities between mappers, designers, open data enthusiasts etc.

Wikimaps user group in Wikimedia

Susanna Ånäs proposed a user group for mapping related activities in Wikimedia. A user group is not an organization, but it would be recognized as a Wikimedia affiliate. It could give visibility and emphasis to mapping activities, and help channel the discussions about development issues or community needs.

Would you have interest in the group? Do you think you could benefit from it? Would you be interested in being a founding member?


Nordic open geodata gathering

The Nordic open geodata gathering will be arranged in the context of #Hack4NO in Hønefoss, Norway, on 27 October. Come with us to cross bridges between countries, communities and practices in open geodata! The program will touch on working with historical maps, locations and photographs, humanitarian mapping and the new Wikimedia mapping technologies that make use of Wikidata and OpenStreetMap. You are also welcome to present your projects and collaboration ideas!

When: 27th October 2016
Where: Norwegian Mapping Authority (Kartverksveien 21, Hønefoss, Norway – one hour outside Oslo)
Sign up: Sign up using the #hack4NO registration form
Fee: Free of charge (lunch included)
Contact persons: Susanna Ånäs (susanna.anas@gmail.com), Miska Knapek (miska@knapek.org), Erno Mäkinen (ernoma@gmail.com), Zeljka Jakir (zeljka.jakir@kartverket.no)

Programme

8.30–9.00 Registration

9.00–10.15 Presentations
Why network in the Nordic countries (Miska Knapek)
Humanitarian mapping in Finland (Erno Mäkinen)
Interactive maps on Wikipedia – Why and how? (Albin Larsson)
Crowdsourced geotagging of images (Vahur Puik)
Combining aspects of humanitarian mapping and working with cultural heritage – humanitarian GLAM? (Susanna Ånäs)

10.15–11.45 Open ideas
Present or propose collaboration or a hack! If you are interested in presenting in the program, please contact the organizers. You may jot down your suggestion in this pad. This will also become the documentation of the event.

11.45–12.45 Lunch

12.45–13.45 Workshops 1 (parallel sessions)
Wikimedia maps and SPARQL (Albin Larsson)
Working with historical maps (Lars Jynge Alvik, Lars Rogstad, Susanna Ånäs)

13.45–14.45 Workshops 2 (parallel sessions)
Humanitarian mapping via HOT tasking manager (Erno Mäkinen)
Geotagging old images (Vahur Puik, Vemund Olstad)

14.45–15.15 Coffee break

15.15–16.30 Discussion
Building networks for continued collaboration, projects and hacks. Find your soulmates!

18.00 Dinner in Hønefoss (own cost)

Bridging communities in the Nordic region

The various volunteered geographic information communities in the Nordic countries have everything to gain from each other. Geolocation, also in the historical context, is becoming an essential part of the knowledge preserved in Wikimedia projects. Humanitarian mapping is more topical than ever: affected communities are everywhere, even nearby, making it easy to step into action. Governmental open geodata is more advanced in the Nordic countries than anywhere else in the world. We can make something unique out of what we collectively know.

The meetup day hosts lightning talks to get to know what’s happening in the Nordic open geodata community, and hands-on workshops to get familiar with tools and technologies. In the open part you can show your work and give a talk, or we can set up collaborations for the weekend or beyond. We’ll end the day by discussing ways of working together.

#hack4no also has excellent data journalism mapping lectures and workshops running parallel with the meetup. You are welcome to move freely between the Nordic open geodata meetup and the data journalism sessions.

See you in Hønefoss!


Historical Pasila – steps towards historical street view

The Digital Humanities hackathon was arranged by the University of Helsinki and Aalto University for the second time on 16–20 May 2016. The educational event aimed to bring together students and researchers of the humanities, social sciences and computer science for a week of active co-operation in groups under the heading of Digital Humanities.

Exploring the Changing Helsinki

I joined the group that investigated the temporal change of Helsinki through open data and maps provided by Helsinki Region Infoshare, along with cityscape photography by Helsinki City Museum, served through Finna.fi. My fellow investigators were Jussi Kulonpalo (Univ. Helsinki, Social Research/Urban Studies), Timo Laine (Univ. Helsinki, Moral and Social Philosophy), Kari Laine (Aalto University, Computer Networks), Antti Härkönen (Univ. Eastern Finland, General History), Kaisa Sarén (Univ. Helsinki, Bioinformatics), Maija Paavolainen (Univ. Helsinki, Library Sciences) & Rauli Laitinen (Aalto University, Built Environment/Urban Academy).

Elanto's new commercial building in Pasila, which housed a general store and a bread shop. Photo: Kari Hakli, 1973, Helsinki City Museum. Hertankatu 15 (= Pasilan puistotie, Hertanmäki). On the right Hertankatu 13, where a shoe repair shop closed in 1970. Negative, film, black and white.

Pasila, Länsi-Pasila. Photo: Jan Alanco, HKM, 8 June 1982, Helsinki City Museum. Pasilanraitio 13, the Pasila police station. Photographed from Radiokatu towards Leanportti. In the background the wooden houses of Hertanmäki. Negative and print, film/paper, black and white.

Puu-Pasila

We chose Western Pasila as the neighbourhood we wanted to observe. It was the last wooden village to be torn down to make way for a completely new housing project. At the time, it was also the last remaining freely built working-class neighbourhood, dating back to the turn of the 20th century. The area had been under threat of demolition since the 1940s, which had led to negligence and deteriorating living conditions.

Searching images

Finna has an impressive collection of openly licensed photographs from the area, taken mainly by photographers of the Helsinki City Museum in a pursuit to document the disappearing environment. The photographs do not have exact coordinate locations in their metadata, so Timo Laine, who also works for Finna, proposed using the street name tags to find all Pasila images instead. To get a list of the names, we digitized two publications of Helsinki historical street names and used the list to query Finna images.
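A street-name query of this kind can be sketched against the public Finna REST API. The parameter choices below (the free-text `lookfor` search and the image `format` filter) are my assumptions about the search endpoint, not the code we actually used:

```python
# Sketch: build a Finna API search URL for images matching a street name.
# Illustrative only; the parameter choices are assumptions.
from urllib.parse import urlencode

FINNA_SEARCH = "https://api.finna.fi/v1/search"

def finna_image_query(street_name, limit=20):
    """Return a search URL for images whose metadata mentions a street."""
    params = {
        "lookfor": street_name,
        "filter[]": 'format:"0/Image/"',  # restrict results to images
        "limit": limit,
    }
    return FINNA_SEARCH + "?" + urlencode(params, doseq=True)

url = finna_image_query("Hertankatu")
print(url)
```

Looping such queries over the digitized street-name list would reproduce the image harvest described above.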

Reverend Väinö Kantele – Architect Reijo Jallinoja index

Timo came up with a method of evaluating the images by the nature of their keywords. The elements of change were positioned on an axis: old–new, wood–concrete, soft–hard, past–future, freedom–control, community–family unit, Reverend Kantele–Reijo Jallinoja (Reverend Väinö Kantele was a missionary who decided to devote his life to aiding the poor people of Pasila, while Reijo Jallinoja was the architect of the new Western Pasila). Keywords were assigned to one or the other category, and this produced an index number for each image.
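Timo's index can be approximated like this; the keyword sets below are invented stand-ins for the real categories, and the scoring rule is my reconstruction of the idea rather than his actual method:

```python
# Score an image on the old--new axis from its keywords: -1 for each
# "old world" keyword, +1 for each "new world" keyword, normalized to
# the range [-1, 1]. The keyword sets are illustrative stand-ins.

OLD = {"wood", "past", "freedom", "community", "soft"}
NEW = {"concrete", "future", "control", "family unit", "hard"}

def change_index(keywords):
    hits = [kw for kw in keywords if kw in OLD or kw in NEW]
    if not hits:
        return 0.0  # no classifiable keywords: neutral
    score = sum(1 if kw in NEW else -1 for kw in hits)
    return score / len(hits)

print(change_index(["wood", "community", "concrete"]))  # negative: leans old
```

Each image then gets a single number, which is what made it possible to sort the whole photo set along the axis of change.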

Charting the area

I started working towards a street view experiment. First we needed to create a map of the area.

I took a 1969 aerial image from HRI to work with, using the Wikimaps workflow: I uploaded it to Wikimedia Commons, added the Map template to it and used Wikimaps Warper to georeference it. I was lucky enough to find a few houses remaining from before the development, for compositing the then-and-now images. The file is also available as a readily georeferenced GeoTIFF.


Making a historical map

OpenHistoricalMap is the perfect tool and environment for historical mapping, even though the project is not yet stable. It has all the same tools as OpenStreetMap for creating a map. Features are tagged with start and end dates, which will (in the future) allow setting and displaying map features on a timeline.


Data conversion

Turning the map data in OpenHistoricalMap into something that other environments can use is difficult. After trying out many unsupported, broken and obsolete workflows, I found a working solution.

For exporting the data, the OpenStreetMap desktop editor JOSM works nicely. Albin Larsson has written a useful blog post that covers making the right settings. OHM’s own export is currently dysfunctional.

For converting the .osm file format into GeoJSON or KML, it’s possible to use Overpass Turbo for OpenHistoricalMap. It requires learning the query language.
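For very small extracts, the conversion can even be done by hand. This sketch converts the tagged nodes of a .osm XML file into GeoJSON points with Python's standard library only; ways and relations, which require resolving node references, are deliberately left out:

```python
# Minimal .osm (XML) to GeoJSON conversion for point features only.
# Ways and relations need node-reference resolution and are omitted.
import json
import xml.etree.ElementTree as ET

def osm_nodes_to_geojson(osm_xml):
    root = ET.fromstring(osm_xml)
    features = []
    for node in root.findall("node"):
        tags = {t.get("k"): t.get("v") for t in node.findall("tag")}
        if not tags:
            continue  # skip untagged, geometry-only nodes
        features.append({
            "type": "Feature",
            "geometry": {
                "type": "Point",
                # GeoJSON order is [longitude, latitude]
                "coordinates": [float(node.get("lon")), float(node.get("lat"))],
            },
            "properties": tags,
        })
    return {"type": "FeatureCollection", "features": features}

sample = """<osm version="0.6">
  <node id="1" lat="60.199" lon="24.933">
    <tag k="name" v="Puu-Pasila"/>
    <tag k="start_date" v="1895"/>
  </node>
</osm>"""
print(json.dumps(osm_nodes_to_geojson(sample), indent=2))
```

Note how the OHM `start_date`/`end_date` tags simply come along as GeoJSON properties, so the timeline information survives the conversion.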

Geolocating the images

Historical street view vistas should be easy and painless to create for volunteer contributors. I keep looking for a workflow that is open, easy, collaborative and productive.

I took a test set of 12 images of the same building, from different angles, over a period of a few years.

 


Ajapaik

The Ajapaik project would be my choice for crowdsourcing the locations and directions of the images. This time there was no time for a collaborative effort, so I followed a workflow that was tested in Maptime Copenhagen together with Mapillary.


Geosetter

Geosetter is a great application (Windows only) for setting the location and heading of images and writing the data into the EXIF metadata of the image.

It does not seem possible to load a historical map layer into Geosetter, which would be not only useful but essential in order to geolocate historical images.
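What Geosetter writes is ordinary EXIF GPS metadata, where each coordinate is stored as a hemisphere reference letter plus degree/minute/second rationals. A sketch of that conversion (illustrating the format only, not Geosetter's code):

```python
# Convert a decimal coordinate into the EXIF GPS representation:
# a hemisphere reference letter plus (degrees, minutes, seconds).
# Sketch of the EXIF GPS format, not Geosetter's implementation.
from fractions import Fraction

def to_exif_gps(value, is_latitude):
    """Return e.g. ('N', (60, 11, 54.78)) for latitude 60.19855."""
    if is_latitude:
        ref = "N" if value >= 0 else "S"
    else:
        ref = "E" if value >= 0 else "W"
    v = abs(value)
    degrees = int(v)
    minutes = int((v - degrees) * 60)
    # Use exact fractions to avoid accumulating float error
    seconds = Fraction(v).limit_denominator(10**6) * 3600 \
        - degrees * 3600 - minutes * 60
    return ref, (degrees, minutes, float(seconds))

print(to_exif_gps(60.19855, True))
```

An EXIF writer would store these as the `GPSLatitudeRef`/`GPSLatitude` tag pair (and similarly for longitude), with the compass heading going into `GPSImgDirection`.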

Google Earth

The advantage of Google Earth is the display of 3D terrain and buildings. I planned to import the converted historical map into SketchUp and give the buildings height, but I was not able to convert the file to a format SketchUp understands. It would be possible to use Google Earth to overlay an image onto the 3D view and place it correctly. This workflow should be ported to an open platform.

The street view in Mapillary

After the files were prepared with proper EXIF data, they were ready to be imported into Mapillary. In this test I stopped here; the series of images did not yet resemble a street view experience. There are controls I can use to tie the images together correctly, and MapillaryJS is available for further tinkering. I will look into these in the next blog post.


The street view experiments can continue in workshops during the summer. There will be one in Wikimania in Esino Lario, Italy 22–26 June and another one in the Second Swiss Cultural Hackathon in Basel 1–2 July. I will also bring this workflow into discussion in the Beyond the Basics: What Next for Crowdsourcing workshop in the DH2016 conference in Krakow 11–16 July.

More

Apologies to our group for not presenting all our findings! The presentation will give more insight into that. Thank you for inspiring collaboration during the week!

Thank you to the organizers and participants in the other groups for an intensive and energetic week of Digital Humanities hacking!

Introducing MetaPipe

Update: MetaPipe has been renamed GLAMpipe.

Background

I have previously made a tool called Flickr2GWToolset. It is a simple tool for editing the metadata of Flickr images and exporting the data to an XML file for the GLAM-Wiki Toolset. The tool was aimed mainly at GLAM collection metadata. As you can see below, the user interface of Flickr2GWToolset is rather complicated.

The lesson learned from that project was that the problem with designing this kind of tool is how to make all the functionality available without scaring the user away. The user interface becomes complicated, and adding new features makes it even more complicated.


Flickr2GWToolset user interface

To me, it seems obvious that extending user interfaces like the one seen above to include more and more functionality is a dead end (or it would require a super-talented designer). And even if there were such a designer, one fundamental problem remains.

The remaining problem is that, after the metadata is processed, there is no trace of what was done with the data. The history of actions is not there. If someone asks “what did you do with the metadata?”, one can only try to remember the actual steps. What if I could just show what I did? Or, even better, re-run my process with a different dataset?

At this point programmers raise their hands and say: “Just write scripts and then you can process any number of datasets.” That is true. Still, this approach has some problems.

The first problem is obvious: how do you write scripts if you are not a programmer? The second problem is re-usability. When someone writes scripts, for example for processing metadata for a Wikimedia Commons upload, the results are often “hack until it works” scripts: awkward hacks, hardcoded values and no documentation (at least this is what my scripts look like). This makes re-using other programmers’ scripts very difficult, and people keep re-inventing the wheel over and over again.

The third problem is more profound. It is related to the origin of the data, and I’ll deal with it in the next section.

Collection metadata vs. machine data

When speaking of tools for (meta)data manipulation, it is important to define what kind of data we are dealing with. Here I make a distinction between machine data and collection data.

A server log file (a file that can tell, for example, which web pages were viewed and when) is machine data. It is produced by a computer and has a consistent structure *and* content. You can rely on that structure when you are manipulating the data. If there is a date field, then it contains a date in a certain format, with no exceptions. When processing is needed, a script is created, tested and finally executed. The data is now processed. There is no need to edit this data by hand at any point; in fact, hand editing would endanger the reliability of the data.

Collection data, on the contrary, has a “human nature”. It is produced by humans over some period of time. That period can include several changes in the way the data was produced and structured. When this data is made publicly accessible, it usually has a consistent structure, but there may be various inconsistencies in the content’s structure and semantics.

For example, “author” can contain a name or names, but it can also contain dates of birth or death, or even descriptions of the authors. Or a “description” field can contain just a few words, or all the possible information about the target that could not be fitted anywhere else in the data structure (and that should be placed somewhere else in an upload).
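To make the inconsistency concrete, here is the kind of defensive parsing such an “author” field forces on you. The field values and the patterns are invented examples, not from any particular collection:

```python
# Best-effort parsing of an inconsistent "author" field. Handles a
# plain name, "Name (1881-1950)" and "Name, 1881-1950"; anything else
# falls through untouched for manual editing. Invented example data.
import re

PATTERN = re.compile(
    r"^\s*(?P<name>[^(,]+?)\s*[(,]\s*(?P<born>\d{4})\s*[-–]\s*(?P<died>\d{4})\s*\)?\s*$"
)

def parse_author(raw):
    """Return (name, born, died); born/died are None when unparsable."""
    m = PATTERN.match(raw)
    if m:
        return m.group("name"), int(m.group("born")), int(m.group("died"))
    return raw.strip(), None, None  # leave for manual editing

print(parse_author("Signe Brander (1869-1942)"))
print(parse_author("unknown photographer"))
```

Every new special case means another pattern, which is exactly why the last few percent of records are usually cheaper to fix by hand.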

This kind of data can be an algorithmic nightmare. There are special cases and special cases of special cases, and it would be almost impossible to make an algorithm that could deal with every one of them. Often you can handle 90 or 99 percent of the cases. For the rest, it might be easiest just to edit the data manually.

When working with this kind of data, it is important to be able to make manual edits during the process, which is difficult when the data is processed with scripts only.

GLAMpipe by WMFI

GLAMpipe (we are searching for a better name) builds its user interface on the concepts of visual programming and node-based editing. Both are based on visual blocks that can be added, removed and re-arranged by the user. The result is both a visual presentation and an executable program. A node-based user interface is not a new idea, but for some cases it is a good one. There is a good analysis of node-based (actually flow-based, which is a slightly different thing) interfaces here: http://bergie.iki.fi/blog/inspiration-for-fbp-ui/. Below you can see an example of visual programming with Scratch. Can you figure out what happens when you click the cat?

Visual programming with Scratch. Image by scratch.mit.edu, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=35453328


GLAMpipe combines visual programming and node-based editing very loosely. The result of using nodes in GLAMpipe is a program that processes data in a certain way. You can think of nodes as modifiers or scripts that are applied to the current dataset.


A simple GLAMpipe project.

Above you can see a screenshot of GLAMpipe showing a simple project. There is a Flickr source node (blue), which brings data into the collection (black). Then there are two transform nodes (brownish). The first one extracts the year from the “datetaken” field of the data and puts the result into a field called “datetaken_year”. The second one combines “title” and the extracted year with a comma and saves the result to a new field called “commons_title”.

Here is one record after the transform nodes have been executed:

ORIGINAL title: “Famous architect feeding a cat”
ORIGINAL datetaken: 1965-02-01 00:00:00
NEW datetaken_year: 1965
NEW commons_title: “Famous architect feeding a cat, 1965”

Note that the original data remains intact. This means one can re-run the transform nodes any number of times. Let’s say you want the year in the Commons title inside brackets, like this: “Famous architect feeding a cat (1965)”. You can add brackets around “datetaken_year” in the transform node’s settings, then just re-run the node, and the new values are written to the “commons_title” field.
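In plain code, the two transform nodes amount to something like the sketch below. The node names come from the screenshot above; the implementation is just illustrative Python, not GLAMpipe's JavaScript:

```python
# The two transform nodes from the example, as plain functions.
# Original fields are never overwritten; results go to new fields.

def extract_year(record, source="datetaken", target="datetaken_year"):
    """Copy the leading YYYY of a timestamp field into a new field."""
    record[target] = record[source][:4]
    return record

def combine(record, fields=("title", "datetaken_year"), target="commons_title"):
    """Join several fields with a comma into a new field."""
    record[target] = ", ".join(record[f] for f in fields)
    return record

record = {"title": "Famous architect feeding a cat",
          "datetaken": "1965-02-01 00:00:00"}
combine(extract_year(record))
print(record["commons_title"])  # Famous architect feeding a cat, 1965
```

Because the original fields stay untouched, re-running either function with different settings simply overwrites the derived field, which is the re-run behaviour described above.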

The nodes have their own parameters and settings. This means that all information about the editing process is packed into the project’s node setup. Now, if someone asks “How did you create the Commons titles for your data?”, I can share this setup. Even better, one can change the dataset and re-run the nodes by replacing the source node with a new source node that has a similar data structure. So if one wants to process some other Flickr image album, this can be done by replacing the source node with one pointing to the other album.

However, we are still fiddling with different options for how the UI should work.

Nodes

Nodes are the building blocks of data manipulation in GLAMpipe. Nodes can import data, transform data, download files, upload content or export content to a certain format.

Some examples of currently available nodes:

  • Flickr source node. It needs a Flickr API key (which is free) and an album id. When executed, it imports the data into the collection.
  • File source node. This node reads data from a file and imports it into the collection. Currently it accepts data in CSV or TSV format.
  • Wikitext transform node. This maps your data fields to the Photograph or Map template and writes wikitext into a new field in the collection.
  • Flickr image URL lookup node. This fetches URLs for different image sizes from Flickr and writes the info to the collection.
  • Search and replace transform node. This node searches for a string, replaces it and writes the result back to the collection in a new field.

Below is a screencast of using the georeferencer node. Georeferencer is a view node; a view node is basically a web page that can fetch and alter data via the GLAMpipe API.

Technically, nodes are JSON files that include several scripts. You can find more in-depth information here: https://github.com/artturimatias/metapipe-nodes
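As a rough idea of the shape, a transform node file might look something like this. This is a hypothetical example I made up to illustrate the structure, not a real node from the repository:

```json
{
  "name": "extract_year",
  "type": "transform",
  "params": {
    "source_field": "datetaken",
    "target_field": "datetaken_year"
  },
  "scripts": {
    "run": "out.value = data[params.source_field].substring(0, 4);"
  }
}
```

The parameters live alongside the script, which is what makes a whole node setup shareable as plain files.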

How to test?

GLAMpipe is a work in progress. Do not expect things to just work.

Installing GLAMpipe currently requires some technical skills. GLAMpipe is server software, but the only way to test it now is to install it on your own computer. There are directions for installation on Linux. Installation should also be possible on other operating systems, but that has not been tested.

https://github.com/artturimatias/GLAMpipe

 

My assignment (Ari) in the Wikimaps 2.0 project is to improve the map upload process. This is done by adjusting a separate WMFI tool development project called MetaPipe (working title) so that it helps with map handling. The development of the tool is funded by the Finnish Ministry of Education and Culture.

Wikimaps on Maps Mania: Maps in the Age of Cholera & The Vintage Maps of Berlin

The wonderful, prolific and very popular Maps Mania blog featured the Wikimaps Warper a few times recently; do check the posts out!

The first interactive map, The Vintage Maps of Berlin, uses the Wikimaps Warper.


Keir writes:

This collection of old historical maps of Berlin centers around what is now Museum Island in Berlin.

In the oldest maps you can clearly see the two towns of Cölln and Altberlin, on opposite banks of the River Spree. As you progress through the maps you can see how this area of Berlin has changed and developed over the centuries.

Do check out the 11 maps of Berlin from 1652 to today here: http://homepage.ntlworld.com/keir.clarke/leaflet/berlin.htm

The second post and interactive map, entitled Maps in the Age of Cholera, is based on an epidemiological map of Leeds (coincidentally my home).


This was also created by Keir, who writes:

Twenty Years before John Snow famously mapped the locations of cholera victims in Broad Street, London, Robert Baker plotted the deaths of cholera victims in Leeds. Maps in the Age of Cholera is a story map based around Robert Baker’s ‘Sanitary Map of the Town of Leeds’ exploring the 1832 cholera epidemic in the Yorkshire town. Baker never made the link between cholera and contaminated water. However, in his map and in the accompanying report to the Leeds Board of Health, Baker noted that “the disease was worst in those parts of the town where there is often an entire want of sewage, drainage and paving”.

The map itself uses the Leaflet Story Map plug-in, a library that uses jQuery to create a scroll-driven story map. The map tiles for Robert Baker’s 1832 ‘Sanitary Map of the Town of Leeds’ come from Wikimaps Warper.

Do go check out the interactive story map here: http://homepage.ntlworld.com/keir.clarke/leaflet/cholera.htm

 

 

 


Wikimaps Warper 2.0

Tim Waters, Susanna Ånäs, Albin Larsson and Ari Häyrinen received an IEG (Individual Engagement Grant) from the Wikimedia Foundation to work on a renewed Warper for the Wikimaps project.

We have observed that if the Warper experience was easier and more pleasant, we would attract more volunteers in warping old maps.

In order to let some developers work on the program features and others on the user interface, we need to separate and decouple the program components. The program could then serve any external application through the API. The Warper’s own user interface would be one of the program’s clients, and the Warper would be dogfooding its own API.

New features

We also have neat new features that we had already started to work on. Stay tuned for specific blog posts about them!

  • Importing a Commons category of maps into the Warper.
  • Importing control points for already georeferenced maps, for single maps as well as batches.
  • The most wanted features from the feature wish list.

Design

We are focusing only on the program features in this project, but we are guided by the envisioned design of the tool. Have a look and comment on the evolving mockup doc.

Finding maps

Viewing

Viewing several maps

Rectifying

Cropping

Logic model

What we put in, who we address, and what comes out in the short, medium and long term. Do you agree? Have a look and comment!

What to expect

We will arrange focused meetings with advisors regarding specific aspects of the development, and make documentation about the meetings available. Additionally, there will be a public Hangout session open for everyone. If you wish to participate, enroll as a volunteer or leave a note on the project page.

About us

Tim Waters – back end. Tim is the original developer of the MapWarper software. It was originally created for the New York Public Library, where it serves thousands of openly available maps. The open source code has spawned MapWarpers around the globe.

Albin Larsson – front end. Albin has worked on the iD editor for OpenHistoricalMap, a sister project for the Wikimaps project. He has made a proposal for a linked data approach to OpenHistoricalMap data and created a prototype of modeling historical buildings in 3D.

Ari Häyrinen – integrations. Ari is developing a metadata enhancement tool to be used with GLAM mass uploads in a Wikimedia Finland project. The project will link between copies of files in different repositories to take advantage of spatial mapping possibilities offered by some of them.

Susanna Ånäs – coordination, design. Susanna has run the Wikimaps project and network over a two-year period. She will contribute to the design aspects, but mainly her task is to facilitate communication and participation in the project.

Follow the project via these

All of Wikimaps Nordic in a report

The latest historical map, Nova et aucta orbis terrae descriptio ad usum navigantium emendate accomodata by Gerhard Mercator, was uploaded to Wikimedia Commons today. There are now 15 620 maps with the Map template in Wikimedia Commons!

The Wikimaps Nordic project has now officially been completed. The Nordic Culture Fund and the Wikimedia Foundation have supported bringing tools for historical maps to Wikimedia Commons and creating maps activities, especially in the Nordic countries. We welcome you to view the project report and to comment and propose new directions!

The work continues with the Wikimaps Warper 2.0 project, funded by the Wikimedia Foundation. There are many more directions to follow, and here are short teasers for some of the ideas. What do you think? Comment here, in the Facebook group, or in the Commons project page.

Historical place task force

We would like to invite all wikidatans and linked data buffs to discuss how data about historical places should be modelled in Wikidata.

Enhancing the workflow from maps uploads to a world map of history

Facilitating uploading maps to Wikimedia Commons, linking maps and historical data about places, using the maps and data in Wikimedia projects and working together with OpenHistoricalMap to store the world map of history.

More tools, more stories

We hope to work on bringing more tools together to help find and interpret open historical documents. With the help of these tools it should be possible to enrich Wikimedia content, but also to produce original research that could be used in further research, hacks, apps, stories and more.

Imagine finding an old picture of a place or people and unfolding the story behind it with the help of thousands of volunteers interested in the same things. Imagine you could continue gathering context to the story with the help of maps, images and documents from museums and archives as well as pictures and letters in your own albums and shoeboxes. Let’s work together on making it possible in the open environments!

“Finnmarkens amt nr 13 – Wardøehuus Festning udi Grundritz, 1793”. This file was digitized and shared in Wikimedia Commons by Kartverket. Licensed under CC BY 4.0 via Wikimedia Commons.

Old maps of Jerusalem released

“Kidron Monuments 1868” by Alexander von Wartensleben-Schwirsen – National Library of Israel, Eran Laor Cartographic Collection. Licensed under CC BY 3.0 via Wikimedia Commons.

The National Library of Israel has released a collection of 200 unique high-resolution maps of Jerusalem in collaboration with Wikimedia Israel. This collection of ancient maps, spanning from 1486 to 1947, contains a variety of styles and languages.

To celebrate this, Wikimedia Israel will hold an editathon / mapathon in the National Library. It will be a social event to create, update, and improve Wikipedia articles about maps, cartographers, and locations. The editathon will take place on December 15th and will include a guided tour in the rare map collection of the National Library.

Wikimedia Israel encourages the Wikimaps community to use the maps in Wikimedia initiatives and other open source projects and wishes for a wonderful Holiday Season and a Happy New Year!


Public art from streets to the net

DroneArt Helsinki! was an event organized in collaboration with Wikimedia Finland, Maptime Helsinki and AvoinGLAM, where we experimented with bringing public artworks and statues into the open through Wikimedia sites. At the same time Wikimedia Sweden’s Jan Ainali, John Andersson and André Costa were visiting us. The aim was to photograph statues with drones, place them on a map and model them in 3D.

Little seminar

We put together a great program on presenting public art in Wikimedia. Heikki Kastemaa opened the event by discussing how public art is written about in Wikipedia. According to him, there is no established method for it, but articles about public art often contain some or all of the following: a description of the artwork, its provenance and its reception. Images of the artworks are intended to communicate information about the works.

John Andersson spoke about the Wiki Loves Public Art project that was initiated by Wikimedia Sweden. Thanks to the project, our colleagues in Sweden have been able to gather a public art database covering the whole country. André Costa presented productions that were created around the project.

The database has been turned into the Offentligkonst.se map service. In another project images from Wiki Loves Monuments and Mapillary have been brought together.

Copyright of public art

Finland and Sweden have broadly similar copyright legislation based on pan-European practice. A work of art is protected by copyright for 70 years after the death of the artist. Reproductions of the work, such as images or 3D models, may not be distributed freely. Works placed permanently in public space are an exception, but this exception is handled slightly differently in different EU countries.

The exception is called the Freedom of Panorama. In Finland, all artworks placed permanently in public space or in its vicinity may be photographed freely, but the images must not be used commercially.

In Sweden the copyright organization BUS has demanded compensation from Wikimedia Sweden for publishing images on the Internet, based on the claim that the database can be used free of charge and that the distribution of images is large in scale compared to traditional printing of postcards. The case between Wikimedia Sverige and BUS is now in the Supreme Court. (Mikael Mildén, Konstnärer stämmer Wikimedia, Fria Tidningen 26.3.2015)

In addition, images stored in Wikimedia Commons must comply with the copyright legislation of both the originating country and the United States. Because of this, even more images fall outside Wikimedia Commons.

In the Finnish Wikipedia, images are stored locally and can be used in the respective article about the artwork, under the right to quote in Finnish copyright law.

Representatives of the Wikimedia movement are striving to influence the ongoing copyright reform in Brussels, and one of the most important reforms is the unification of freedom of panorama. Local representatives are welcome to join this work.

Unleash the drones!

We wanted to try out many different ways to photograph the statues to create the 3D models.

Each statue is photographed from all sides with many images. A drone or a long monopod helps in reaching the heights. All corners of the statue must be covered from different shooting angles.

The weather forecast promised stormy wind and rain. There were clouds in the sky. The drone was tossed around by the wind, but finally we went happily back to our workshop with plenty of images.


As a result of working with different Structure from Motion programs, we were happy to note that the stormy clouds were not the only clouds: we managed to create a point cloud! A point cloud is a computer-generated 3D model that consists of points situated in 3D space. By comparing images taken from different angles, matching points can be found and mapped into 3D space, much as surveyors derive distances by triangulation.
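To give a feel for the geometry behind Structure from Motion, here is a minimal sketch of the triangulation step: given two camera projection matrices and the pixel coordinates of one matching point in each image, the 3D position is recovered by linear (DLT) triangulation. This is an illustrative toy example with made-up camera parameters, not the pipeline any particular SfM program uses.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from two image observations via linear (DLT) triangulation.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: (u, v) pixel coordinates of the same point in each image.
    """
    # Each observation contributes two linear constraints on the
    # homogeneous 3D point X; stack them into a 4x4 system A @ X = 0.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Two toy cameras sharing intrinsics K: one at the origin, one shifted along x.
K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.5, 0.2, 4.0, 1.0])   # a point in front of both cameras
x1 = P1 @ X_true; x1 = x1[:2] / x1[2]     # project into image 1
x2 = P2 @ X_true; x2 = x2[:2] / x2[2]     # project into image 2

X_est = triangulate(P1, P2, x1, x2)
print(np.round(X_est, 3))                 # close to (0.5, 0.2, 4.0)
```

A real SfM program repeats this for thousands of matched feature points, while also estimating the camera poses themselves; the resulting set of points is the point cloud.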

The point cloud was further elaborated into a polygon mesh, out of which a 3D printer was able to create an object. We shall attach the image here as soon as we get it. The copyright of the work has expired. The artist is Bertel Nilsson, and the work is Eagles, made in 1913.