
Week 48: A SCAPE Developer Short Story

It’s been two weeks since the internal SCAPE developer workshop in Brno, Czech Republic. It was a great workshop: we had a lot of presentations and demos, and we were brought up to date on what’s going on in the other corners of the SCAPE project. We also had some (loud) discussions, but I think we reached some good agreements on where we as developers are going next, and we started a number of development and productisation activities. I came home with a long list of things to do next week (which ended up being not at all what I actually did last week, but I still have the list, so next week, fingers crossed). Tasks for week 48:

  • xcorrSound
    • make versioning stable and meaningful (this I looked at together with my colleague in week 48)
    • release new version (this one we actually did)
    • finish writing nice microsite
    • tell my colleague to finish writing the small website where you can test the xcorrSound tools without installing them yourself
    • write unit tests
    • introduce automatic rpm packaging?
    • finish xcorrSound Hadoop job
    • do the xcorrSound Hadoop Testbed Experiment
      • Update the corresponding user story on the wiki
      • Write the new evaluation on the wiki
    • finish the full Audio Migration + QA Hadoop job (see the sketch after this list)
    • do the full Audio Migration + QA Hadoop Testbed Experiment
      • Update the corresponding user story on the wiki
      • Write the new evaluation on the wiki
    • write a number of new blog posts about xcorrSound and SCAPE testbed experiments
    • new demo of xcorrSound for the SCAPE all-staff meeting in February
  • SCAPE testbed demonstrations
    • define the demos that we at SB are going to do as part of testbed (this one we also did in week 48; the actual demos we’ll make next year)
  • FITS experiment (hopefully not me, but a colleague)
  • JPylyzer experiment (hopefully me)
  • Mark FFprobe experiment as not active
  • … there are some more points for the next months, but I’ll spare you…
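As an aside, the audio items above boil down to a two-step, per-file pipeline: migrate the source audio to wav, then QA the result by cross-correlating it against an independently decoded reference with xcorrSound’s waveform-compare tool. Here is a minimal sketch of that per-file step, not the actual Hadoop job; the file paths, and the assumption that waveform-compare signals a mismatch through a non-zero exit code, are mine:

```python
# Hedged sketch of the per-file "Audio Migration + QA" step.
# Assumes ffmpeg and xcorrSound's waveform-compare are on PATH;
# the exit-code convention for waveform-compare is an assumption.
import subprocess

def migrate_and_check(mp3_path: str, wav_path: str, reference_wav: str) -> bool:
    # Migration: decode the source mp3 to wav with ffmpeg.
    subprocess.run(["ffmpeg", "-y", "-i", mp3_path, wav_path], check=True)
    # QA: cross-correlate the migrated wav against a reference that was
    # decoded from the same source with a different decoder.
    result = subprocess.run(["waveform-compare", reference_wav, wav_path])
    return result.returncode == 0
```

In the Hadoop job this would run in the mapper, one file pair per record.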

So what did I do in week 48? Well, I sort of worked on the JPylyzer experiment from the list above. In the Digital Preservation Technology Development department at SB we are currently working on a large-scale digitized-newspapers ingest workflow, including QA. As part of this work we run JPylyzer from Hadoop on all the ingested files, and then validate a number of properties using Schematron. These properties currently come from the requirements we gave the digitization company; in the SCAPE context they should come from policies, so there is still some work to do for the experiment. But running JPylyzer from Hadoop, and validating properties from the JPylyzer output using Schematron, now seems to work in the SB large-scale digitized-newspapers ingest project :-)
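To make that concrete, here is a minimal per-file sketch of the JPylyzer + Schematron combination. It assumes the jpylyzer CLI is on PATH and lxml is installed; the schema file name and the input path are made-up placeholders, and in the real workflow this runs from Hadoop rather than as a standalone script:

```python
# Hedged sketch: characterise one JP2 file with jpylyzer and validate
# selected properties of its XML report against a Schematron schema.
# "newspaper-profile.sch" is a hypothetical schema name.
import subprocess
from lxml import etree, isoschematron

def validate_jp2(jp2_path: str, schema_path: str) -> bool:
    # jpylyzer writes its property report as XML on stdout
    report = subprocess.run(
        ["jpylyzer", jp2_path], capture_output=True, check=True
    ).stdout
    doc = etree.fromstring(report)
    schematron = isoschematron.Schematron(etree.parse(schema_path))
    return schematron.validate(doc)

print("PASS" if validate_jp2("page_0001.jp2", "newspaper-profile.sch") else "FAIL")
```

Keeping the checks in a Schematron schema rather than hard-coding them is what should make the later switch from digitization-contract requirements to policy-derived properties a schema change rather than a code change.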

And for now I’ll put week 50 on the list above, and when I have finished a sufficient number of bullet points I’ll blog again! This post is missing links, so I hope you can read it without them.

metadata entry

Contribution: BoletteJurik

Name: BoletteJurik

URL: link to the original post

Entry: http://www.openplanetsfoundation.org/blogs/2013-12-04-week-48-scape-developer-short-story

Language: English

Format: text/html