
Historical Pasila – steps towards historical street view

The Digital Humanities hackathon was arranged by the University of Helsinki and Aalto University for the second time, 16–20 May 2016. The educational event brought together students and researchers of the humanities, social sciences and computer science for a week of active co-operation in groups under the heading of Digital Humanities.

Exploring the Changing Helsinki

I joined the group that investigated the temporal change of Helsinki through open data and maps provided by Helsinki Region Infoshare, along with cityscape photography by the Helsinki City Museum, served through Finna.fi. My fellow investigators were Jussi Kulonpalo (Univ. Helsinki, Social Research/Urban Studies), Timo Laine (Univ. Helsinki, Moral and Social Philosophy), Kari Laine (Aalto University, Computer Networks), Antti Härkönen (Univ. Eastern Finland, General History), Kaisa Sarén (Univ. Helsinki, Bioinformatics), Maija Paavolainen (Univ. Helsinki, Library Sciences) and Rauli Laitinen (Aalto University, Built Environment/Urban Academy).

Elanto's new commercial building in Pasila, housing a general store and a bread shop. Kari Hakli, photographer, 1973. Helsinki City Museum. Hertankatu 15 (= Pasilan puistotie, Hertanmäki). On the right Hertankatu 13, with a shoe repair shop that closed in 1970. — negative, film, b&w

Pasila, Länsi-Pasila. Jan Alanco, HKM, photographer, 8 June 1982. Helsinki City Museum. Pasilanraitio 13, the Pasila police building. Photographed from Radiokatu towards Leanportti. In the background the wooden houses of Hertanmäki. — negative and print, film and paper, b&w

Puu-Pasila

We chose Western Pasila as the neighbourhood we wanted to observe. It was the last wooden district torn down to make way for a completely new housing project, and at the time also the last remaining freely built working-class neighbourhood, dating back to the turn of the 20th century. The area had been under threat of demolition since the 1940s, which had led to negligence and deteriorating living conditions.

Searching images

Finna has an impressive collection of openly licensed photographs from the area, taken mainly by photographers of the Helsinki City Museum in pursuit of documenting the disappearing environment. The photographs do not have exact coordinates in their metadata, so Timo Laine, who also works for Finna, proposed using the street name tags to find all Pasila images instead. To get a list of the names, we digitized two publications of historical Helsinki street names and used the list to query Finna images.

Reverend Väinö Kantele – Architect Reijo Jallinoja index

Timo came up with a method of evaluating the images by the nature of their keywords. The elements of change were positioned on axes: old–new, wood–concrete, soft–hard, past–future, freedom–control, community–family unit, Reverend Kantele–Reijo Jallinoja (Reverend Väinö Kantele was a missionary who decided to devote his life to aiding the poor people of Pasila, while Reijo Jallinoja was the architect of the new Western Pasila). Keywords were assigned to one or the other category, and this produced an index number for each image.

Charting the area

I started working towards a street view experiment. First we needed to create a map of the area.

I took a 1969 aerial image from HRI to work with, following the Wikimaps workflow: I uploaded it to Wikimedia Commons, added the Map template to it and used Wikimaps Warper to georeference it. I was lucky enough to find a few houses remaining from before the redevelopment for compositing the then-and-now images. The file also exists as a readily georeferenced GeoTIFF.


Making a historical map

OpenHistoricalMap is the perfect tool and environment for historical mapping, even though the project is not yet stable. It has the same editing tools as OpenStreetMap for creating a map. Features are tagged with start and end dates, which will (in the future) allow placing and displaying map features on a timeline.
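As a sketch of what this looks like in practice (the values here are illustrative, not from our actual Pasila data), a demolished wooden house might carry tags such as:

building=yes
building:material=wood
start_date=1903
end_date=1978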


Data conversion

Turning the map data in OpenHistoricalMap into something that other environments can use is difficult. After trying out many unsupported, broken and obsolete workflows, I found a working solution.

For exporting the data, the OpenStreetMap desktop editor JOSM works nicely. Albin Larsson has written a useful blog post covering the right settings. OHM's own export is currently dysfunctional.

For converting the .osm file format into GeoJSON or KML, it's possible to use Overpass Turbo for OpenHistoricalMap, though it requires learning the query language.
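For the curious, here is a minimal Ruby sketch of querying Overpass directly instead of through the Turbo interface. The endpoint URL and bounding box are illustrative assumptions, and the Overpass JSON that comes back still needs a converter such as osmtogeojson to become strict GeoJSON:

require 'net/http'
require 'uri'

# Illustrative endpoint; check OpenHistoricalMap's documentation for the current one
endpoint = URI('https://overpass-api.openhistoricalmap.org/api/interpreter')

# Overpass QL: all ways with a start_date tag inside a rough Pasila bounding box
query = '[out:json];way["start_date"](60.19,24.92,60.21,24.94);out geom;'

response = Net::HTTP.post_form(endpoint, 'data' => query)
File.write('pasila_overpass.json', response.body)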

Geolocating the images

Historical street view vistas should be easy and painless to create for volunteer contributors. I keep looking for a workflow that is open, easy, collaborative and productive.

I took a test set of 12 images of the same building, shot from different angles over a period of a few years.



Ajapaik

The Ajapaik project would be my choice for crowdsourcing the locations and directions of the images. This time there was no time for a collaborative effort, so I followed a workflow that had been tested at Maptime Copenhagen together with Mapillary.


Geosetter

Geosetter is a great application (Windows only) for setting the location and heading of images and writing the data into the EXIF metadata of the image.

It does not seem possible to load a historical map layer into Geosetter, which would be not only useful but essential for geolocating historical images.
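The same EXIF fields can also be written programmatically. Below is a rough Ruby sketch using the mini_exiftool gem (a wrapper around the exiftool command-line program, which must be installed); the file name and coordinates are made up for illustration:

require 'mini_exiftool'

photo = MiniExiftool.new('hertanmaki_1973.jpg')

# Camera position in decimal degrees
photo.gps_latitude      = 60.201
photo.gps_latitude_ref  = 'N'
photo.gps_longitude     = 24.933
photo.gps_longitude_ref = 'E'

# Compass direction the camera was facing, relative to true north ('T')
photo.gps_img_direction     = 215
photo.gps_img_direction_ref = 'T'

photo.save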

Google Earth

The advantage of Google Earth is its display of 3D terrain and buildings. I planned to import the converted historical map into SketchUp and give the buildings height, but I was not able to convert the file into a format SketchUp would understand. Google Earth could then be used to overlay an image onto a 3D view and place it correctly. This workflow should be ported to an open platform.

The street view in Mapillary

After the files were prepared with proper EXIF data, they were ready to be imported into Mapillary. In this test I stopped here: the series of images did not yet resemble a street view experience. There are controls for tying the images together correctly, and MapillaryJS is available for further tinkering. I will look into them for the next blog post.


The street view experiments can continue in workshops during the summer. There will be one at Wikimania in Esino Lario, Italy, 22–26 June, and another at the second Swiss Cultural Hackathon in Basel, 1–2 July. I will also bring this workflow up for discussion in the Beyond the Basics: What Next for Crowdsourcing workshop at the DH2016 conference in Krakow, 11–16 July.

More

Apologies to our group for not presenting all our findings! The presentation will give more insight into them. Thank you for the inspiring collaboration during the week!

Thank you to the organizers and to the participants in the other groups for an intensive and energetic week of Digital Humanities hacking!

Introducing MetaPipe

update: MetaPipe was renamed GLAMpipe

Background

I have previously made a tool called Flickr2GWToolset, a simple tool for editing the metadata of Flickr images and exporting it to an XML file for the GLAM-Wiki Toolset. The tool was aimed mainly at GLAM collections metadata. As you can see below, the user interface of Flickr2GWToolset is rather complicated.

The lesson learned from that project was that the problem in designing this kind of tool is making all the functionality available without scaring the user. The user interface becomes complicated, and every new feature makes it more so.


Flickr2GWToolset user interface

For me, it seems obvious that extending user interfaces like the one above to include more and more functionality is a dead end (or would require a super-talented designer). And even with such a designer, one fundamental problem remains.

The remaining problem is that, after the metadata is processed, there is no trace of what was done with the data. The history of actions is not there. If someone asks "what did you do with the metadata?", one can only try to remember the actual steps. What if I could just show what I did? Or, even better, re-run my process on a different dataset?

At this point programmers raise their hands and say: "Just write scripts and then you can process any number of datasets." That is true, but this approach has its own problems.

The first one is obvious: how do you write scripts if you are not a programmer? The second problem is re-usability. When someone writes scripts, for example for processing metadata for a Wikimedia Commons upload, the result is often a "hack until it works" script: awkward hacks, hardcoded values and no documentation (at least this is what my scripts look like). This makes re-using other programmers' scripts very difficult, and people keep re-inventing the wheel over and over again.

The third problem is more profound. It is related to the origin of the data, and I will deal with it in the next chapter.

Collection metadata vs. machine data

When speaking of tools for (meta)data manipulation, it is important to define what kind of data we are dealing with. I make a distinction here between machine data and collection data.

A server log file (a file that tells, for example, which web pages were viewed and when) is machine data. It is produced by a computer and has consistent structure *and* content. You can rely on that structure when manipulating the data: if there is a date field, it contains a date in a certain format, with no exceptions. When processing is needed, a script is created, tested and finally executed, and the data is processed. There is no need to edit this data by hand at any point; in fact, hand editing would endanger the reliability of the data.

Collection data, on the contrary, has a "human nature". It is produced by humans over some period of time, and that period can include several changes in the way the data was produced and structured. When this data is made publicly accessible, it usually has a consistent structure, but there may be various inconsistencies in the content's structure and semantics.

For example, "author" can contain a name or names, but it can also contain dates of birth and death, or even descriptions of the authors. A "description" field can contain just a few words, or all the possible information about the object that could not be fitted anywhere else in the data structure (and that should be placed somewhere else in an upload).

This kind of data can be an algorithmic nightmare. There are special cases and special cases of special cases, and it would be almost impossible to write an algorithm that deals with every one of them. Often you can handle 90 or 99 percent of the cases; for the rest, it may be easiest to just edit the data manually.
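To make this concrete, here is a tiny Ruby illustration (the field values are invented): a rule that extracts life dates from an "author" field handles one common form of the value and simply flags everything else for manual editing:

authors = [
  'Brander, Signe (1869-1942)',
  'Brander, Signe',
  'Brander, Signe, photographer at the Helsinki City Museum'
]

authors.each do |author|
  # Handle the common "Name (birth-death)" form; everything else is a special case
  if author =~ /\A(.+?)\s*\((\d{4})-(\d{4})\)\z/
    puts "name: #{$1}, born: #{$2}, died: #{$3}"
  else
    puts "needs manual editing: #{author}"
  end
end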

When working with this kind of data, it is important to be able to make manual edits during the process, which is difficult when the data is processed with scripts only.

GLAMpipe by WMFI

GLAMpipe (we are searching for a better name) builds its user interface on the concepts of visual programming and node-based editing. Both are based on visual blocks that can be added, removed and re-arranged by the user. The result is both a visual presentation and an executable program. A node-based user interface is not a new idea, but for some cases it is a good one. There is a good analysis of node-based (actually flow-based, which is a slightly different thing) interfaces here: http://bergie.iki.fi/blog/inspiration-for-fbp-ui/. Below you can see an example of visual programming with Scratch. Can you work out what happens when you click the cat?


Visual programming with Scratch. Image by scratch.mit.edu, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=35453328

GLAMpipe combines visual programming and node-based editing rather loosely. The result of using nodes in GLAMpipe is a program that processes data in a certain way. You can think of nodes as modifiers or scripts that are applied to the current dataset.


A simple GLAMpipe project.

Above you can see a screenshot of GLAMpipe showing a simple project. There is a Flickr source node (blue), which brings data into the collection (black). Then there are two transform nodes (brownish). The first extracts the year from the "datetaken" field and puts the result in a field called "datetaken_year". The second combines "title" and the extracted year with a comma and saves the result in a new field called "commons_title".

Here is one record after the transform nodes have been executed:

ORIGINAL title: “Famous architect feeding a cat”
ORIGINAL datetaken: 1965-02-01 00:00:00
NEW datetaken_year: 1965
NEW commons_title: “Famous architect feeding a cat, 1965”

Note that the original data remains intact, which means the transform nodes can be re-run any number of times. Let's say you want the year in the Commons title inside brackets, like this: "Famous architect feeding a cat (1965)". You can add brackets around "datetaken_year" in the transform node's settings, then just re-run the node, and the new values are written to the "commons_title" field.
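In plain Ruby, the idea behind the two transform nodes looks roughly like this (a sketch of the concept, not GLAMpipe's actual code):

record = {
  'title'     => 'Famous architect feeding a cat',
  'datetaken' => '1965-02-01 00:00:00'
}

# Transform 1: extract the year into a new field; the original stays intact
record['datetaken_year'] = record['datetaken'][/\d{4}/]

# Transform 2: combine title and year into a new field
record['commons_title'] = "#{record['title']}, #{record['datetaken_year']}"

# Re-running with changed settings simply overwrites the derived field
record['commons_title'] = "#{record['title']} (#{record['datetaken_year']})"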

The nodes have their own parameters and settings, which means that all the information about the editing process is packed into the project's node setup. Now, if someone asks "How did you create the Commons titles for your data?", I can share this setup. Even better, one can change the dataset and re-run the nodes by replacing the source node with a new one with a similar data structure. So to process some other Flickr image album, just swap in a source node that points to the other album.

However, we are still fiddling with different options for how the UI should work.

Nodes

Nodes are the building blocks of data manipulation in GLAMpipe. Nodes can import data, transform data, download files, upload content or export content to a certain format.

Some examples of currently available nodes:

  • Flickr source node. It needs a Flickr API key (which is free) and an album id. When executed, it imports the album's data into the collection.
  • File source node. Reads data from a file and imports it into the collection. Currently it accepts data in CSV or TSV format.
  • Wikitext transform node. Maps your data fields to the Photograph or Map template and writes wikitext into a new field in the collection.
  • Flickr image URL lookup node. Fetches URLs for different image sizes from Flickr and writes the info to the collection.
  • Search and replace transform node. Searches for a string, replaces it and writes the result back to the collection in a new field.

Below is a screencast of using the georeferencer node. Georeferencer is a view node; a view node is basically a web page that can fetch and alter data via the GLAMpipe API.

Technically, nodes are JSON files that include several scripts. You can find more in-depth information here: https://github.com/artturimatias/metapipe-nodes

How to test?

GLAMpipe is a work in progress. Do not expect things to just work.

Installing GLAMpipe currently requires some technical skills. GLAMpipe is server software, but the only way to test it now is to install it on your own computer. There are directions for installation on Linux. Installation on other operating systems should also be possible, but has not been tested.

https://github.com/artturimatias/GLAMpipe

 

My assignment (Ari) in the Wikimaps 2.0 project is to improve the map upload process. This is done by adjusting a separate tool development project by WMFI, called MetaPipe (working title), so that it helps with map handling. The development of the tool is funded by the Finnish Ministry of Education and Culture.

Wikimaps on Maps Mania: Maps in the Age of Cholera & The Vintage Maps of Berlin

The wonderful, prolific and very popular Maps Mania blog featured the Wikimaps Warper a few times recently; do check the posts out!

The first interactive map, The Vintage Maps of Berlin, uses the Wikimaps Warper.


Keir writes:

This collection of old historical maps of Berlin centers around what is now Museum Island in Berlin.

In the oldest maps you can clearly see the two towns of Cölln and Altberlin, on opposite banks of the River Spree. As you progress through the maps you can see how this area of Berlin has changed and developed over the centuries.

Do check out the 11 maps of Berlin from 1652 to today  here: http://homepage.ntlworld.com/keir.clarke/leaflet/berlin.htm

The second post and interactive map, entitled Maps in the Age of Cholera, is based on an epidemiological map of Leeds (coincidentally my home town).


This map was also created by Keir, who writes:

Twenty Years before John Snow famously mapped the locations of cholera victims in Broad Street, London, Robert Baker plotted the deaths of cholera victims in Leeds. Maps in the Age of Cholera is a story map based around Robert Baker’s ‘Sanitary Map of the Town of Leeds’ exploring the 1832 cholera epidemic in the Yorkshire town. Baker never made the link between cholera and contaminated water. However, in his map and in the accompanying report to the Leeds Board of Health, Baker noted that “the disease was worst in those parts of the town where there is often an entire want of sewage, drainage and paving”.

The map itself uses the Leaflet Story Map plug-in, a library that uses jQuery to create a scroll-driven story map. The map tiles for Robert Baker's 1832 'Sanitary Map of the Town of Leeds' come from Wikimaps Warper.

Do go check out the interactive story map here: http://homepage.ntlworld.com/keir.clarke/leaflet/cholera.htm



Wikimaps Warper 2.0

Tim Waters, Susanna Ånäs, Albin Larsson and Ari Häyrinen received an IEG (Individual Engagement Grant) from the Wikimedia Foundation to work on a renewed Warper for the Wikimaps project.

We have observed that if the Warper experience were easier and more pleasant, we would attract more volunteers to warping old maps.

In order to let some developers work on the program features and others on the user interface, we need to separate the program's components: decouple them. The program could then serve any external application through its API. The Warper's own user interface would be just one of the program's clients, dogfooding the same API.

New features

We also have neat new features that we had already started working on. Stay tuned for specific blog posts about them!

  • Importing a Commons category of maps to the Warper.
  • Importing control points for already georeferenced maps, for single maps as well as batches.
  • The most wanted features from the wish list.

Design

We are focusing only on the program features in this project, but we are guided by the envisioned design of the tool. Have a look and comment on the evolving mockup doc.

Finding maps

Viewing

Viewing several maps

Rectifying

Cropping

Logic model

What we put in, who we address, and what comes out in the short, medium and long range. Do you agree? Have a look and comment!

What to expect

We will arrange focused meetings with advisors on specific aspects of the development and make documentation of the meetings available. Additionally, there will be a public Hangout session open to everyone. If you wish to participate, enrol as a volunteer or leave a note on the project page.

About us

Tim Waters – back end. Tim is the original developer of the MapWarper software. It was originally created for the New York Public Library, where it serves thousands of openly available maps. The open source code has spawned MapWarpers around the globe.

Albin Larsson – front end. Albin has worked on the iD editor for OpenHistoricalMap, a sister project of the Wikimaps project. He has made a proposal for a linked data approach to OpenHistoricalMap data and created a prototype for modelling historical buildings in 3D.

Ari Häyrinen – integrations. Ari is developing a metadata enhancement tool to be used with GLAM mass uploads in a Wikimedia Finland project. The project will link between copies of files in different repositories to take advantage of spatial mapping possibilities offered by some of them.

Susanna Ånäs – coordination, design. Susanna has run the Wikimaps project and network over the past two years. She will contribute to the design, but her main task is to facilitate communication and participation in the project.

Follow the project via these

All of Wikimaps Nordic in a report

The latest historical map in Wikimedia Commons, Nova et aucta orbis terrae descriptio ad usum navigantium emendate accommodata by Gerhard Mercator, was uploaded today. There are now 15 620 maps with the Map template in Wikimedia Commons!

The Wikimaps Nordic project has now officially been completed. The Nordic Culture Fund and the Wikimedia Foundation have supported bringing mapping tools for historical maps to Wikimedia Commons and creating maps activities, especially in the Nordic countries. We welcome you to view the project report, comment, and propose new directions!

The work continues with the Wikimaps Warper 2.0 project, funded by the Wikimedia Foundation. There are many more directions to follow, and here are short teasers for some of the ideas. What do you think? Comment here, in the Facebook group, or on the Commons project page.

Historical place task force

We would like to invite all Wikidatans and linked data buffs to discuss how data about historical places should be modelled in Wikidata.

Enhancing the workflow from map uploads to a world map of history

Facilitating uploading maps to Wikimedia Commons, linking maps and historical data about places, using the maps and data in Wikimedia projects and working together with OpenHistoricalMap to store the world map of history.

More tools, more stories

We hope to bring more tools together to help find and interpret open historical documents. With the help of these tools it should be possible to enrich Wikimedia content, but also to do original research that could be used in further research, hacks, apps, stories and more.

Imagine finding an old picture of a place or people and unfolding the story behind it with the help of thousands of volunteers interested in the same things. Imagine you could continue gathering context for the story with maps, images and documents from museums and archives, as well as pictures and letters from your own albums and shoeboxes. Let's work together to make this possible in open environments!

"Finnmarkens amt nr 13 – Wardøehuus Festning udi Grundritz, 1793". This file was digitized and shared in Wikimedia Commons by Kartverket. Licensed under CC BY 4.0 via Wikimedia Commons.

Old maps of Jerusalem released

"Kidron Monuments 1868" by Alexander von Wartensleben-Schwirsen – National Library of Israel, Eran Laor Cartographic Collection. Licensed under CC BY 3.0 via Wikimedia Commons.

The National Library of Israel, in collaboration with Wikimedia Israel, has released a collection of 200 unique high-resolution maps of Jerusalem. The collection of historical maps, spanning from 1486 to 1947, contains a variety of styles and languages.

To celebrate, Wikimedia Israel will hold an editathon/mapathon at the National Library: a social event to create, update and improve Wikipedia articles about maps, cartographers and locations. The editathon will take place on December 15th and will include a guided tour of the library's rare map collection.

Wikimedia Israel encourages the Wikimaps community to use the maps in Wikimedia initiatives and other open source projects, and wishes everyone a wonderful Holiday Season and a Happy New Year!


Public art from streets to the net

DroneArt Helsinki! was an event organized in collaboration between Wikimedia Finland, Maptime Helsinki and AvoinGLAM, where we experimented with bringing public artworks and statues into the open through Wikimedia sites. Wikimedia Sweden's Jan Ainali, John Andersson and André Costa were visiting us at the same time. The aim was to photograph statues with drones, place them on a map and model them in 3D.

Little seminar

We put together a great program presenting public art in Wikimedia. Heikki Kastemaa opened the event by describing writing about public art in Wikipedia. According to him, there is no established method for it, but articles about public art often contain some or all of the following: a description of the artwork, its provenance and its reception. Images of the artworks are intended to communicate information about the works.

John Andersson told us about the Wiki Loves Public Art project initiated by Wikimedia Sweden. Thanks to the project, our colleagues in Sweden have been able to gather a public art database covering the whole country. André Costa presented productions that were created around the project.

The database has been turned into the Offentligkonst.se map service. In another project, images from Wiki Loves Monuments and Mapillary have been brought together.

Copyright of public art

Finland and Sweden have basically similar copyright legislation, based on pan-European practice. A work of art is protected by copyright for 70 years from the death of the artist. Reproductions of the work, whether images or 3D models, may not be distributed freely. Works placed permanently in public space are an exception, but this exception is handled with slight differences in different EU countries.

The exception is called Freedom of Panorama. In Finland, all artworks placed permanently in public space or its vicinity may be photographed freely, but the images must not be used commercially.

In Sweden, the copyright organization BUS has demanded compensation from Wikimedia Sweden for publishing images on the Internet, based on the claim that the database can be used free of charge and that the distribution of images is large in scale compared to traditional postcard printing. The case between Wikimedia Sverige and BUS is now in the Supreme Court. (Mikael Mildén, Konstnärer stämmer Wikimedia, Fria Tidningen 26.3.2015)

In addition, images stored in Wikimedia Commons must comply with the copyright legislation of both the originating country and the United States. Because of this, even more images fall outside Wikimedia Commons.

In the Finnish Wikipedia, images are stored locally and can be used in the respective article about the artwork, under the right to quote in Finnish copyright law.

Representatives of the Wikimedia movement are striving to influence the ongoing changes to copyright legislation in Brussels, and one of the most important reforms is the unification of Freedom of Panorama. Local representatives are welcome to join this work.

Unleash the drones!

We wanted to try out many different ways to photograph the statues to create the 3D models.

Each statue is photographed from all sides with many images. A drone or a long monopod helps in reaching the heights. All corners of the statue must be explored from different shooting angles.

The weather forecast promised stormy wind and rain. There were clouds in the sky and the drone was tossed around by the wind, but in the end we went happily back to our workshop with plenty of images.


As a result of working with different Structure from Motion programs, we were happy to note that the storm clouds were not the only clouds: we managed to create a point cloud! A point cloud is a computer-generated 3D model that consists of points situated in 3D space. By comparing images taken from different angles, matching points can be found and mapped into 3D space, almost as mystically as surveyors reveal distances.

The point cloud was further refined into a polygon mesh, from which a 3D printer was able to create an object. We shall attach an image here as soon as we get it. The copyright of the work has expired: the artist is Bertel Nilsson, and the work is Eagles, made in 1913.

Wiki Loves Maps seminar and hackathon in Helsinki

The Wiki Loves Maps seminar (February 5, 2015) and hackathon (February 6–8, 2015) were arranged in Helsinki. The four days brought together a full house on each day.

The Wiki Loves Maps seminar was organized by Wikimedia Suomi in collaboration with the City of Helsinki. The Wiki Loves Maps hackathon was part of the #Hack4FI – Hack Your Heritage event.

Thanks to our team Teemu, Samppa and Ari, and thanks to the #Hack4FI team Sanna, Laura and Neea, as well as Anna, Arttu and Juhani from Media Factory! And all of you who participated onsite or online!

Documentation

The seminar video stream is available for viewing.
In Flickr there are photographs by Teemu Perhiö / Wikimedia Finland, licensed under CC BY-SA 2.0, and #Hack4FI photographs by AvoinGLAM, licensed under CC BY-NC-SA.
The seminar presentations are shared in a Drive folder.
The #Hack4FI project folder is in Drive.
Wikimaps has a Facebook group. There is now also a new Facebook group for #Hack4FI, where you can connect with others who took part.
The Documentation page on the Wiki Loves Maps site will be updated with new posts.

We would appreciate it if you filled in the survey; it will help justify further events in the future! If you can't find the email, use this link. Thank you!

Seminar

Historical Aleksanterinkatu

The theme of Wiki Loves Maps was Historical Aleksanterinkatu, a joint initiative with the City of Helsinki to gather historical materials about the city and to combine and reuse them. Arend Oudman gave a bird's-eye view of the city through the series of historical aerial images of the metropolitan area that he has gathered, prepared and opened. Martti Helminen from the City Archives is the originator of the Aleksanterinkatu theme; he took us through the history of the street with images, maps and drawings.

GLAMs working together with Wikimedia

Collaboration projects in the Nordic countries between GLAM organizations and Wikimedia chapters.
Lars Lundqvist: Introduction to Working together
André Costa: GLAM + Wikimedia Sverige = True
Joonas Loide: GLAM – Wikimedia Eesti
Sanna Hirvonen: Wikimedia Finland ♡ GLAM

Case studies

Encoding and experiencing location in apps.
Juuso Lehtinen: www.helsinkiennen.fi
Timo Korkalainen: Using Augmented Reality for presenting historical sites
Vahur Puik: Ajapaik.ee / Timepatch.net – Crowdsourcing geotags and rephotos for historic photographs

Keynote: Peter Neubauer, Mapillary

Peter Neubauer: Open Source, Open World, Open Future
Peter presented Mapillary, a crowdsourced street view environment. More than that, he pointed out the necessity of open source and open content, and the need to create together. He also coined the term open past!

Keynote: Mauricio Giraldo, NYPL Labs

Mauricio Giraldo: NYPL Labs: what we've learned in 3 years
Here are the 4 principles: 1: Start with a prototype. 2: Polish takes time. 3: Everything takes longer than anticipated. 4: Hackathons are starting points.

Cultural hackathons #Hack4DK, #Hack4NO and #Hack4FI


The first ever #Hack4FI carries the torch of #Hack4DK and #Hack4NO.
Jacob Wang: Hack4DK – Projects and insights
Håvard Johansen: Hack4NO
Sanna Marttila: Hack4FI – Hack Your Heritage

DIY History


We gathered a set of speakers to present different approaches to working with local and personal history. These materials have fallen outside publicly funded GLAM endeavours; they are not suited to Wikimedia for their lack of notability, and they largely live in commercial services.

Sanna Jokela, Lounaispaikka: Developing public services for local history and tourism, and possibilities with historical geodata. Picture
Kaisa Kyläkoski, Sukututkijan loppuvuosi: Social media as a platform for a DIY historian. Input – output
Kimmo Lehtonen, Albumit Auki!, Lasipalatsin Mediakeskus: A project that collects family photographs from old albums. Screenshots, API documentation for #Hack4FI.
Pauliina Latvala, University of Turku: Background study for ratifying the Faro Convention: what people regard as heritage and how to develop the dialogue between authorities and people. Why, how and for whose benefit should we enhance cultural heritage?

Lars Lundqvist, Riksantikvarieämbetet: Observations from sustaining the Platsr.se service for historical storytelling. Local history – Platsr.se

#Maptime!


Bert Spaan presented the international learning network Maptime, whose idea is to bring people together to learn web mapping tools. Join MaptimeHEL and prepare for the first meeting! Bert also briefly presented the Dutch historical geocoder project Erfgoed & Locatie.

#Hack4FI hackathon

The hackathon, the first of its kind in Finland, was a tour de force by AvoinGLAM. After the weekend, participants have six weeks to finalise their works. The hackathon ends on 26 March with a gala, where the final works will be presented and awarded.

Wiki Loves Maps projects were only a part of the hacks done over the weekend; you can see a full list of ideas here. Have a taste of the maps projects, or projects by mappers, that were worked on during the event!

Historical street view

Ajapaik–Mapillary

We aim to connect two projects: Ajapaik, which uses people's help to locate the places of old images, and Mapillary, which seamlessly stitches those images into a street view. Signe Brander's Helsinki images are already in Ajapaik, and more will follow! The first test transfers from Ajapaik to Mapillary are being made.
By Peter Neubauer @peterneubauer and Vahur Puik @puik, @Ajapaik

Mapillary widget in Wikimedia Commons

Wikimedia Sweden has collaborated with Mapillary to create a map of the images in the Wiki Loves Monuments competition. Mapillary images can now also be uploaded to Wikimedia Commons and displayed there as a street view. Add this script to your common.js in your Wikimedia Commons settings.
André Costa @lokal_profil and Peter Neubauer @peterneubauer

A Warper widget to similarly display a warped historical map needs to be created in the next hackathon!

Infinipic

Mauricio Giraldo created this infinite photograph zoom-out movie maker, inspired by Istvan Banyai's Zoom, using photographs of the Finnish poet Edith Södergran.

Images are processed in Python with OpenCV to find good transition regions and then animated using Processing.
@mgiraldo

OpenStreetMap – OpenHistoricalMap – (Wikimaps) Warper

Map of OpenStreetMap objects with links to Wikipedia articles

See the shapes of buildings and monuments that have Wikipedia articles on a map. The map background can also be a historical map. By Tuukka Hastrup @tuukkah.

If the background map needs adjusting, go to Wikimaps Warper to fix it! If the link to a Wikipedia article is missing, add Wikidata/Wikipedia tags to the objects in OpenStreetMap (see the example below). If a Wikipedia article is missing, you should write one! (And translate it into several languages.)
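The links are ordinary tags on the OpenStreetMap object, for example (the Wikidata id below is made up for illustration):

wikipedia=fi:Aleksanterinkatu
wikidata=Q1234567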

Helsinki streets of 1900 in OpenHistoricalMap

You can only store today's features in OpenStreetMap; this is where OpenHistoricalMap comes in. See OpenHistoricalMap at work inside the Wikimaps Warper. The streets have been stripped from the present-day OSM map based on a scanned map of 1900. You can continue the work by adding, deleting and modifying! OHM integration by Tim Waters @tim_waters, streets by SK53 @SK53onOSM.

To play around with georectified historical maps and web maps, see this marvellous tutorial by Mauricio Giraldo.

More wikiprojects

WikiProject Wiki Loves Maps is a hub for tracking missing and new Wikipedia entries on our topics. The Wikipedia articles Kluuvinlahden fossiilit (taideteos) and Aleksanteri II (patsas, Helsinki) were inspired by the Saturday morning walk on the Senate Square and Aleksi. By Heikki Kastemaa.

We also came up with a plan to brush up Wikipedia article introductions to meet Simple Finnish guidelines, and tag them with specific markup. We will team up with writers and users of these texts. To be continued…

The Finnish National Gallery's artist database will be imported into Wikidata, and the Finnish Wikipedia will be validated manually against that data. By Kimmo Virtanen.

Mapping the Swedish Public Art Database with Odyssey.js

André Costa visualized the Swedish Public Art Database with Odyssey.js, which makes it easy to create narrated maps.
@lokal_profil

The amazing Paul Villavicencio in a bearly hug by Jason Brower. CC BY-NC-SA 2.0, AvoinGLAM

Guests from afar

The Ecuadorian Paul Villavicencio studies Information Science in Nagoya, Japan. The words Galleries, Libraries, Archives and Museums caught his attention on the Wiki Loves Maps web pages, and he decided to travel to the seminar and take part in the hackathon.

While studying in Ecuador, Paul worked in the university library. The librarian, his friend and colleague, is an advocate of openness and collaboration with memory institutions. The possibility of contributing to his work made Paul choose to join the event.

#Hack4FI was Paul's first ever hackathon and he was not sure what to expect. After a bit of disorientation, facilitator Sanna helped him join a project team and use his expertise in content retrieval in the Talking Heads project. Seeing memory institutions take part in hacking among others, and participating in cross-disciplinary work, was a memorable experience. What made it all worthwhile is that he hopes to bring his experiences back to his colleague in Ecuador.

We talked with Paul together with Mace Ojala, an information science student and fellow #Hack4FI participant.

Looking forward to the next cultural hackathon. Hope to see you all there!

More info

Now that the event is over, you can find us in:
Wikimaps Facebook, wikimaps.wikimedia.fi
Wikimedia Suomi, Wikimedia Sverige, Wikimedia Eesti, Wikimedia Norge and Wikimedia Danmark.
Maptime Helsinki Facebook, @MaptimeHEL
#Hack4FI Facebook group

Wiki Loves Maps is organized by Wikimedia Suomi ry together with the City of Helsinki. The event is part of the Wikimaps Nordic project, supported by the Nordic Culture Fund. Special thanks to the AvoinGLAM network, Forum Virium and the Ministry of Education and Culture for their collaboration.

We are here. Where do we go next?


Wikimaps participated in three different international events in November. It was time to look back at where we have got to and what kind of opportunities the work has opened.

Creating a maker space for location-based historical storytelling

Moving Historical Geodata to the Web
New York Public Library, November 5–7

Wikimaps has become part of an ecosystem of initiatives that aim to open up and understand the geospatial cultural heritage captured in documents held by libraries, archives and museums. We were especially happy to be invited to the Moving Historical Geodata to the Web event organized by the New York Public Library. The event gathered actors from academia and the open source and civil sectors on three continents to tackle the flow of historical geodata, from printed maps all the way to making use of the data in cultural applications. The goal was to see where projects overlap and to think of common ways to deal with redundancy.

Common ground was sought through different exercises during the two intensive workshop days. At the end, the participants committed to some common goals.

The Wikimaps project commits

to contribute to the OpenHistoricalMap project as a project companion and to establish a seamless workflow from old maps to OpenHistoricalMap. The communities are planning future working methods, and we hope to share some of the tasks.

Another key element of the Wikimaps roadmap has been the ability to use Wikidata as a gazetteer: a database that can connect place names, geographies and their changes through history and across languages. Humphrey Southall of the University of Portsmouth, nominated as Educational Institution of the Year at this year's Wikimania, has done pioneering work testing this in practice with the PastPlace gazetteer. The Pelagios project and others have been investigating the idea of a spinal gazetteer: a reference gazetteer for the individual gazetteers created in various research projects and initiatives.

In order for Wikidata to serve as a historical gazetteer, we must see that

  • the way places and place names are modelled serves historical gazetteers: the alternative names of places can be limited to date ranges or to a specific context.
  • the granularity of places accepted in Wikidata serves at least gazetteers (towns, villages, hamlets and neighborhoods as well as rivers, islands, lakes etc.), if not even more detailed geographic entities.
  • all geographic elements on maps are notable enough to be accepted in Wikidata.
  • there is a good way to include or link to the historical geographic representation.

While waiting for the collected input of the workshop, you can have a look at the slide deck of the participants' presentations, the participant bios and Lex Berman's notes for the event. We also shared our experiences of building the Wikimaps community, which appear in separate slides here.

Wikimaps Expedition

State of the Map, Buenos Aires, November 7–9

State of the Map, the yearly congregation of the OpenStreetMap world, was organized in Latin America for the first time ever. Wikimaps presented the idea of the Wikimaps Expedition, a project model that tries to fuse practices from the successful GLAM projects into participatory mapping projects, along with meaningful humanitarian approaches.


Initially planned for the Archipelago Sea, in a bilingual area between Finland and Sweden, the expedition could take many forms. The project would engage GLAMs, experts, wikimedians, mappers and locals in a common effort to research the story of a location.

The work would have a preliminary online phase, during which historical maps, photographs and geodata would be uploaded to Wikimedia Commons with the help of volunteers and experts. Wikipedians would edit articles and create new ones about the area. Mappers would do historical mapping and upload historical geodata sets to OpenHistoricalMap.

Locals and project participants would get together on location during the preliminary phase, through field trips and other meetings, and finally an expedition would take place: collection events, interviews, editathons, mapping parties and workshops, with all the different project participants and partners staying together on location, producing and documenting, for perhaps a week.

Amazing participation

The highlight of the event was the presentation by the schoolchildren of Río Chico. Their education is carried out entirely over the web, as there is no local teaching staff. They have mapped their little village and travelled to the event to present the project. I missed it on location but was able to catch up on video.

Wikipedia TOWN

The guidelines on notability and the prohibition of original research in Wikipedia have led to a situation where the enormous energy of people researching historical personalities and locations cannot be tapped.

I familiarized myself with the Japanese project Wikipedia TOWN, which has also set out to tackle these problems. They contribute to three different repositories: OpenStreetMap, Wikipedia for notable topics, and LocalWiki for topics that don't meet Wikipedia's criteria.


We are here. Where do we go next?

Iberoconf, Buenos Aires, November 21

Finally, I had the opportunity to pull together thoughts from the previous conferences and present them at Iberoconf, the Latin American meeting of Wikimedia representatives. Thank you to the organizers of all the events!

Devise OmniAuth OAuth Strategy for MediaWiki (Wikipedia, WikiMedia Commons)

Authentication of MediaWiki users with a Rails Application using Devise and OmniAuth

Wikimaps uses a customised version of the MapWarper open source map georectification software, as seen at http://mapwarper.net, adapted to speak with the Commons infrastructure and running on the Wikimedia Foundation's Labs servers. We needed a way to allow Commons users to log in easily, and so I developed the omniauth-mediawiki strategy gem so that Ruby applications can authenticate against MediaWiki wikis, like Wikipedia.org and Wikimedia Commons.


The Wikimaps Warper application uses Devise, which works very nicely with OmniAuth. The image above shows traditional login with username and password and, using OmniAuth, login to Wikimedia Commons, GitHub and OpenStreetMap. After clicking the Wikimedia Commons button the user is presented with the wiki's OAuth authorization page. It may not be that pretty, but once the user allows access, the wiki redirects back to our app and the user is logged in. This library used the omniauth-osm library as an initial framework to build upon. The code is on GitHub: https://github.com/timwaters/omniauth-mediawiki. The gem is on RubyGems: https://rubygems.org/gems/omniauth-mediawiki. You can install it by including it in your Gemfile or by doing:

gem install omniauth-mediawiki

Create new registration

The registration page on mediawiki.org is where you create an OAuth consumer registration for your application. You can specify all Wikimedia wikis or a specific one to work with. Registration creates a key and a secret which work with your own user, so you can start developing straight away, although currently a wiki admin has to approve each registration before other wiki users can use it. Hopefully this will change as more applications move away from HTTP Basic to more secure authentication and authorization strategies in the future!

Usage

Usage is as for any other OmniAuth 1.0 strategy. So, if you're using Rails, you need to add the strategy to your `Gemfile` alongside OmniAuth:

gem 'omniauth'
gem 'omniauth-mediawiki'

Once these are in, you need to add the following to your `config/initializers/omniauth.rb`:

Rails.application.config.middleware.use OmniAuth::Builder do
  provider :mediawiki, "consumer_key", "consumer_secret"
end

If you are using Devise, this is how it looks in your `config/initializers/devise.rb`:

config.omniauth :mediawiki, "consumer_key", "consumer_secret", 
    {:client_options => {:site => 'http://commons.wikimedia.org' }}

If you would like to use this plugin against a different wiki, you can use the environment variable WIKI_AUTH_SITE to set the server to connect to. Alternatively, you can pass the site as a client_option to the OmniAuth config, as seen above. If no site is specified, the www.mediawiki.org wiki will be used.

Notes

In general, see the pages around https://www.mediawiki.org/wiki/OAuth/For_Developers for more information. When registering a new OAuth consumer you need to specify the callback URL properly, e.g. for development:

http://localhost:3000/u/auth/mediawiki/callback
http://localhost:3000/users/auth/mediawiki/callback

This is different from many other OAuth authentication providers, which allow the consumer application to specify what the callback should be. Here we have to define the URL when we register the application; it's not possible to alter it after the registration has been made.

Internally the strategy library has to use `/w/index.php?title=` paths in a few places, like so: :authorize_path => '/wiki/Special:Oauth/authorize', :access_token_path => '/w/index.php?title=Special:OAuth/token', :request_token_path => '/w/index.php?title=Special:OAuth/initiate'. This could be due to a bug in the OAuth extension, or due to how the wiki redirects from /wiki/Special pages to /w/index.php pages. I suspect this may change in the future.

Another thing to note is that the MediaWiki OAuth implementation uses a cool but non-standard way of identifying the user. OmniAuth and Devise need a way to get the identity of the user. Calling '/w/index.php?title=Special:OAuth/identify' returns a JSON Web Token (JWT). The JWT is signed using the OAuth secret, so the library decodes it and gets the user information.
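With the jwt gem, verifying and decoding that identity response looks roughly like this (a sketch, assuming the token is signed with the consumer secret using HS256, as the library does internally):

require 'jwt'

raw = @access_token.get('/w/index.php?title=Special:OAuth/identify').body

# Verify the signature with the consumer secret and decode the claims
payload, _header = JWT.decode(raw, "consumer_secret", true, { :algorithm => 'HS256' })

puts payload['username']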

Calling the MediaWiki API

OmniAuth is mainly about authentication: it's not really about using OAuth to do things on the user's behalf, although that is relatively easy if you want to do it. The OmniAuth developers recommend using it in conjunction with other libraries; for example, if you are using omniauth-twitter, you should use the Twitter gem with the OAuth authentication variables to post tweets. There is no such gem for MediaWiki that uses OAuth. Existing Ruby libraries such as MediaWiki Gateway and MediaWiki Ruby API currently only use usernames and passwords, although they are worth looking at for help in crafting the necessary requests. So we will have to use the OAuth library and call the MediaWiki API directly. In this example we'll call the Wikimedia Commons API.

Within a Devise/OmniAuth setup, in the callback method, you can directly get an OAuth::AccessToken via request.env["omniauth.auth"]["extra"]["access_token"], or you can get the token and secret from request.env["omniauth.auth"]["credentials"]["token"] and request.env["omniauth.auth"]["credentials"]["secret"]. Assuming the authentication token and secret are stored in the user model, the following could be used to query the MediaWiki API at a later date:

# Rebuild the OAuth consumer with the key and secret from the registration
@consumer = OAuth::Consumer.new "consumer_key", "consumer_secret",
            {:site=>"https://commons.wikimedia.org"}
# Recreate the access token from the credentials stored in the user model
@access_token = OAuth::AccessToken.new(@consumer, user.auth_token, user.auth_secret)
# Query the API for the logged-in user's rights and edit count
uri = 'https://commons.wikimedia.org/w/api.php?action=query&meta=userinfo&uiprop=rights|editcount&format=json'
resp = @access_token.get(URI.encode(uri))
logger.debug resp.body.inspect
# {"query":{"userinfo":{"id":12345,"name":"WikiUser",
# "rights":["read","writeapi","purge","autoconfirmed","editsemiprotected","skipcaptcha"],
# "editcount":2323}}}

Here we called the query action for userinfo, asking for rights and editcount information.