Peter Murray-Rust
| Peter Murray-Rust | |
|---|---|
| Timezone | Europe/London (GMT+00:00/GMT+01:00) |
| Institutional Affiliation(s) | University of Cambridge |
| Relevant Projects | https://semanticclimate.github.io/p/en/ |
| Topic Interests | Documents, Scientific Publishing, Ontologies, Text Mining, Wikidata |
| Group(s) | Discourse Modeling, Interdisciplinary Models |
| Discord Handle | handle#1941 |
| ORCID | 0000-0003-3386-3972 |
| Twitter Handle | petermurrayrust |
| GitHub handle | petermr |
Discord
Page Schemas#Creating a new Schema
Page Schemas is mostly a handy way to generate boilerplate templates and link them to semantic properties. A Form (using the Page Forms extension) is an interface for filling in values for a template.
For an example of how this shakes out, see Category:Participant, Template:Participant, Form:Participant.
- Go to a `Category:CategoryName` page, creating it if it doesn't already exist.
- Click "Create schema" in the top right.
- If you want a form, check the "Form" box. It is possible to make a schema without a form: the schema just defines what pages will be generated, and the generated pages can be further edited afterwards (note that this might make them inconsistent with the schema).
- Click "Add template". If you are only planning on having one template per category, name the template the same thing as the category.
- Add fields! Each field can have a corresponding form input (with a type, e.g. a textbox, token input, date selector, etc.) and a semantic property.
- Once you're finished, save the schema.
- Click "Generate pages" on the category page. Typically you want to uncheck any pages that are already bluelinks so you don't overwrite them. You might have to run the "Generate pages" step a few times, and it can take a few minutes, because it's pretty buggy.
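For context, the template that Page Schemas generates typically stores each field value via a semantic property annotation. A minimal sketch of what a generated Template:Participant might look like, assuming a single `Timezone` field mapped to a hypothetical `Has timezone` property (illustrative wikitext, not the exact generated output):

```wikitext
<noinclude>
This is the "Participant" template. It should be called as:
{{Participant
|Timezone=
}}
</noinclude><includeonly>
{| class="wikitable"
! Timezone
| [[Has timezone::{{{Timezone|}}}]]
|}
[[Category:Participant]]
</includeonly>
```

The `[[Property::value]]` syntax is what links the template field to a semantic property, so the value becomes queryable across the wiki.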
Workshop Submission
What's your interest in this workshop?
With what "frame" (or identity) do you approach the workshop?
Tool-builder, Researcher
What materials can you contribute to the workshop for consideration?
HTML versions of the IPCC report; a suite of Python tools for searching, processing, cleaning and analysing them; and semantic climate dictionaries linked to Wikidata and hence into the Linked Open Data cloud.
Organizer-estimated Topics
Was invited to join (2022-11-04). I enjoy sparking off Open community action in creating interoperable systems for science. I worked with the W3C in the creation of XML, which provides a framework for modelling the world. Models are driven by discourse (i.e. what people write) rather than god-given classifications. Henry Rzepa and I created Chemical Markup Language (CML), which is capable of modelling much of published chemistry and interoperates with MathML, HTML and SVG. I have always believed in "rough consensus and running code", so I have written code (Java) that supports the design. The problems are now not conceptual but sociopolitical: scientists are conservative and prefer PDF, which is not machine-processable.
The advent of Wikidata (2012-) now means we can map much of our discourse into the Linked Open Data cloud.
Because we model discourse there is a major requirement for text mining, so my colleagues and I have developed a suite of tools: pygetpapers (Ayush Garg) is a rapid search-and-download tool for Open Access science; docanalysis applies NLP tools; and AMI is a framework for analysing and recombining scientific publications. We have recently concentrated on applying these to the climate literature and making it semantic.
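The dictionaries mentioned above link terms to Wikidata items. A minimal sketch, in pure Python, of how such a term-to-Wikidata mapping could be represented and applied to text (the XML shape and the QIDs are illustrative assumptions, not the exact AMI dictionary format):

```python
import xml.etree.ElementTree as ET

# A tiny dictionary in the spirit of the Wikidata-linked climate
# dictionaries described above. The XML shape and the QIDs are
# illustrative assumptions, not the actual AMI dictionary format.
DICT_XML = """
<dictionary title="climate_demo">
  <entry term="greenhouse gas" wikidataID="Q167336"/>
  <entry term="sea level rise" wikidataID="Q898653"/>
</dictionary>
"""

def load_dictionary(xml_text):
    """Parse dictionary XML into a {term: wikidataID} mapping."""
    root = ET.fromstring(xml_text)
    return {e.get("term"): e.get("wikidataID") for e in root.iter("entry")}

def annotate(text, mapping):
    """Return (term, qid) pairs for dictionary terms found in the text."""
    lower = text.lower()
    return [(term, qid) for term, qid in mapping.items() if term in lower]

mapping = load_dictionary(DICT_XML)
hits = annotate("Rising greenhouse gas concentrations drive sea level rise.",
                mapping)
print(hits)
```

Once a term is resolved to a Wikidata QID, any downstream tool can follow that identifier into the Linked Open Data cloud rather than matching on raw strings.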