Search by property

This page provides a simple browsing interface for finding entities described by a property and a named value. Other available search interfaces include the page property search and the ask query builder.


A list of all pages that have the property "Contains text" with the value "[[joel chan]] -> [[decentralized discourse graph]]". Because there are only a few results, nearby values are also displayed.
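The same kind of lookup can also be written as an inline Semantic MediaWiki #ask query on any wiki page. The sketch below is only illustrative: it queries the `Interested In` property mentioned in several of the results rather than the exact "Contains text" value above (whose nested wikilinks would need escaping), and the property and value names are taken from the archived messages rather than verified against this wiki's schema.

```
{{#ask: [[Interested In::Linked Data]]
 |?Interested In
 |format=ul
}}
```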

Showing below up to 27 results starting with #1.

List of results

  • Page Forms#sneakers-the-rat-22-11-12 19:32:48  + ([[Page Schemas#Creating a new Schema]]
    Page schemas is mostly a handy way to generate boilerplate templates and link them to semantic properties. A Form (using [[Page Forms]]) is an interface for filling in values for a template.
    For an example of how this shakes out, see [[:Category:Participant]], [[Template:Participant]], [[Form:Participant]].
    * go to a `Category:CategoryName` page, creating it if it doesn't already exist.
    * Click "Create schema" in the top right.
    * If you want a form, check the "Form" box. It is possible to make a schema without a form: the schema just defines what pages will be generated, and the generated pages can be further edited afterwards (note that this might make them inconsistent with the schema).
    * Click "add template". If you are only planning on having one template per category, name the template the same thing as the category.
    * Add fields! Each field can have a corresponding form input (with a type, e.g. a textbox, token input, date selector, etc.) and a semantic property.
    * Once you're finished, save the schema.
    * Click "Generate pages" on the category page. Typically you want to uncheck any pages that are already bluelinks so you don't overwrite them. You might have to do the 'generate pages' step a few times, and it can take a few minutes, bc it's pretty buggy.)
    (A hedged wikitext sketch of the kind of template this procedure generates appears after the results list.)
  • Page Schemas#sneakers-the-rat-22-11-12 19:32:48  + (same message as above)
  • Zenodo#sneakers-the-rat-22-11-03 22:51:30  + ([[Project Ideas#Linked Data Publishing On Activitypub]]
    ooh I'm very interested in this. so are you thinking a [[Twitter#Bridge]] -> [[ActivityPub#Bridge]] where one could use markup within the twitter post to declare [[Linked Data#Markup Syntax]] and then post to AP? I have thought about this kind of thing before, like using a bot command syntax to declare prefixes by doing something like
    ```
    @ bot prefix
    foaf: https:// (ontology URL)
    ```
    or
    ```
    @ bot alias
    term: foaf.LongerNameForTerm
    ```
    so that one could do maybe a semantic wikilink like `[ [term::value] ]` either within the tweet or as a reply to it (so the tweet itself doesn't become cluttered / it can become organized post-hoc?).
    I've also thought about a bridge (I called [[Threadodo]]) that implements that kind of command syntax to be able to directly archive threads to [[Zenodo]] along with structured information about the author, but this seems more interesting.
    I can help try and clear some of the groundwork out of the way to make it easier for you and other interested participants to experiment. I have asked around fedi a bunch for a very minimal AP server implementation, and I could try and find one (or we could try and prototype one) if you want to experiment with that :), and I can also document and show you a tweepy-based bot that has an extensible command/parsing system too)
  • Threadodo#sneakers-the-rat-22-11-03 22:51:30  + (same message as above)
  • Twitter#sneakers-the-rat-22-11-03 22:51:30  + (same message as above)
  • Project Ideas#sneakers-the-rat-22-11-03 22:51:30  + (same message as above)
  • Linked Data#sneakers-the-rat-22-11-03 22:51:30  + (same message as above)
  • ActivityPub#sneakers-the-rat-22-11-03 22:51:30  + (same message as above)
  • Discord Messages#sneakers-the-rat-22-11-03 22:51:30  + (same message as above)
  • Discord Messages#sneakers-the-rat-22-11-12 18:40:38  + ([[Semantic MediaWiki]] vs [[WikiBase]]: you're right! Semantic MediaWiki is more for being an interface that can support unstructured and structured information in the same place; it's a lot more freeform and gestural, but at the cost of predictability/strictness/performance as a database. Definitely different tools with different applications, albeit with a decent amount of overlap in philosophy and etc.)
  • WikiBase#sneakers-the-rat-22-11-12 18:40:38  + (same message as above)
  • Semantic MediaWiki#sneakers-the-rat-22-11-12 18:40:38  + (same message as above)
  • Discord Messages#joelchan86-22-11-13 03:44:39  + ([[Source]] for the figure in the previous msg: https://assets.pubpub.org/5nv701md/01521405455055.pdf)
  • Source#joelchan86-22-11-13 03:44:39  + ([[Source]] for the figure in the previous msg: https://assets.pubpub.org/5nv701md/01521405455055.pdf)
  • Testing Wikibot2#sneakers-the-rat-22-11-02 07:43:07  + ([[Testing Wikibot2]])
  • Discord Messages#sneakers-the-rat-22-11-02 07:43:07  + ([[Testing Wikibot2]])
  • Testing Wikibot#sneakers-the-rat-22-11-02 07:43:01  + ([[Testing Wikibot]])
  • Testing Wikibot#sneakers-the-rat-22-11-01 02:02:02  + ([[Testing Wikibot]])
  • Testing Wikibot#sneakers-the-rat-22-10-31 23:55:22  + ([[Testing Wikibot]])
  • Discord Messages#sneakers-the-rat-22-11-02 07:43:01  + ([[Testing Wikibot]])
  • Discord Messages#sneakers-the-rat-22-11-01 02:02:02  + ([[Testing Wikibot]])
  • Discord Messages#sneakers-the-rat-22-10-31 23:55:22  + ([[Testing Wikibot]])
  • Discord Messages#sneakers-the-rat-22-11-05 01:23:44  + ([[Wiki#Organization]]
    As we get towards proposing projects and organizing ideas, I've added a set of pages for the different concepts that y'all indicated either here or in your applications: https://synthesis-infrastructures.wiki/Concepts
    Each page should give a list of participants that have an `Interested In` property on their participant page (or you can declare interest on the page using the template; see example at https://synthesis-infrastructures.wiki/Template:Concept) as another way of finding people with similar interests. Feel free to add additional interests from your own page and add new pages by using the
    `
    {| style="width: 30em; font-size: 90%; border: 1px solid #aaaaaa; background-color: #f9f9f9; color: black; margin-bottom: 0.5em; margin-right: 1em; padding: 0.2em;text-align:left;float:right;clear:right;"
    ! style="text-align: center; background-color:#ccccff;" colspan="2" |<span style="font-size: larger;">Discord Messages</span>
    |-
    | Interested Participants ||
    |}
    [[Category:Concept]]
    `
    template on any new page. The pages are all stubs at the moment, but I have made links between related concepts/subconcepts/etc. These will also help us catch any wikilinks made from within the discord 🙂)
    (A sketch of a query that could produce such a participant listing appears after the results list.)
  • Wiki#sneakers-the-rat-22-11-05 01:23:44  + (same message as above)
  • Discord Messages#Wutbot-22-11-23 18:59:16  + ([[claim]] claims and questions dominate in natural conversation; the imbalance of sources & evidence is quite stark. This aligns with my mental model of *conversational charity*, where we assume our interlocutors *could* ground their statements in evidence if pressed, but skip this step in the interest of time.)
  • Claim#Wutbot-22-11-23 18:59:16  + (same message as above)
  • Open Science Framework#joelchan86-22-11-13 03:43:07  + (a bit further afield, i'd point to the [[Open Science Framework]] as a thoughtful case study in incentive mechanism design focused on integration into *intrinsic* benefits (i'm more thoughtful about my science, i can easily document things so i don't forget them)
    this podcast interview is a decent look into how he thinks about things: https://everythinghertz.com/69
    if i read him right, i sort of agree that infrastructure (possibility) and usability and communities (norms) are prior to / foundational to incentives and policy. top-down incentives and policies that don't align with existing norms and usable practices may risk incentivizing 'just comply with it' practices or just fall flat, like some data sharing mandates.)
  • Discord Messages#joelchan86-22-11-13 03:43:07  + (same message as above)
  • Discord Messages#joelchan86-22-11-03 02:56:36  + (ah, that is both informative and sad to hear. i think ahead of its time is a reasonable diagnosis.
    [[ScholOnto]] I think was also ahead of its time: it had a working prototype integration into a Word processor for directly authoring discourse-graph-like things while drafting a manuscript (described here: https://onlinelibrary.wiley.com/doi/abs/10.1002/int.20188))
  • ScholOnto#joelchan86-22-11-03 02:56:36  + (same message as above)
  • Source#joelchan86-22-11-13 03:36:24  + (another classic [[Source]] on [[Infrastructure]] is Steps Toward an Ecology of Infrastructure: Design and Access for Large Information Spaces)
  • Infrastructure#joelchan86-22-11-13 03:36:24  + (another classic [[Source]] on [[Infrastructure]] is Steps Toward an Ecology of Infrastructure: Design and Access for Large Information Spaces)
  • Worm Community System (WCS)#joelchan86-22-11-13 03:36:24  + (another classic [[Source]] on [[Infrastructure]] is Steps Toward an Ecology of Infrastructure: Design and Access for Large Information Spaces)
  • Discord Messages#joelchan86-22-11-13 03:36:24  + (another classic [[Source]] on [[Infrastructure]] is Steps Toward an Ecology of Infrastructure: Design and Access for Large Information Spaces)
  • Wikilinks#sneakers-the-rat-22-11-10 21:38:12  + (another group ( <#1038988750677606432> ) will i believe be analyzing the semantic information on the wiki ( https://synthesis-infrastructures.wiki/Main_Page ), and you can archive the text of any message onto a wiki page by using [[Wikilinks]] ( so eg. this message will go to https://synthesis-infrastructures.wiki/Wikilinks ))
  • Discord Messages#sneakers-the-rat-22-11-10 21:38:12  + (same message as above)
  • Discourse modeling#Flancian-22-11-13 16:06:41  + (apologies I didn't make it to [[discourse modeling]]!)
  • Discord Messages#Flancian-22-11-13 16:06:41  + (apologies I didn't make it to [[discourse modeling]]!)
  • DIY Algorithms#sneakers-the-rat-22-12-20 10:34:44  + (check this out. [[DIY Algorithms]]. instead of adding accounts to lists and autopopulating, you can directly add posts themselves. so then you can rig up whatever the frick algorithm you want to masto:
    https://social.coop/@jonny/109545449455062668
    https://github.com/sneakers-the-rat/mastodon/tree/feature/postlists)
  • Discord Messages#sneakers-the-rat-22-12-20 10:34:44  + (same message as above)
  • Garden and Stream#sneakers-the-rat-22-11-14 04:21:58  + (encouraging the use of the thread for the sake of people's notifications as we enter slow-mode.
    sidebar: this to me is one of the more interesting uses of this kind of wiki-bot, in a more long-lived chat and communication medium (glad 2 have <@708787219992805407> here for the long-timescales perspective btw). in both this and any future workshops, being able to plug in something like a wikibot that can let different threads get tagged to common concepts through time to different/overlapping discord servers and output to potentially multiple overlapping wikis is v interesting to me.
    I'm gonna continue to make it easier to deploy because i feel like the [[Garden and Stream]] metaphor is one that can unfold on multiple timescales, and it would be cool to build out the ability to make that easier: how cool would it be if you didn't have to decide on a chat/document medium or have to make a new set at the start of an organizing project, since it was arbitrary anyway and your infra supported use and crossposting across many media.
    Eg. the very understanding surfacing of [[The Google Docs Problem]] because of [[Mediawiki]]'s lack of [[Synchronous Editing]] [[Live Editing]], and the need to remember to link out to external services rather than that being a natural expectation of a multimodal group and having systems that explicitly support that, is illustrative to me. Maybe one description is being able to deploy a [[Context of Interoperability]] [[Interoperability]]: during this time period I am intending these documents/discord servers/hashtags/social media accounts/etc. to be able to crosspost between each other so that everyone needs to do as little as possible to make their workflows align.)
  • The Google Docs Problem#sneakers-the-rat-22-11-14 04:21:58  + (same message as above)
  • Mediawiki#sneakers-the-rat-22-11-14 04:21:58  + (same message as above)
  • Synchronous Editing#sneakers-the-rat-22-11-14 04:21:58  + (same message as above)
  • Live Editing#sneakers-the-rat-22-11-14 04:21:58  + (same message as above)
  • Context of Interoperability#sneakers-the-rat-22-11-14 04:21:58  + (same message as above)
  • Interoperability#sneakers-the-rat-22-11-14 04:21:58  + (same message as above)
  • Discord Messages#sneakers-the-rat-22-11-14 04:21:58  + (same message as above)
  • Discord Messages#sneakers-the-rat-22-11-03 23:11:44  + (hello Matthew! very curious about this. As someone not familiar with materials science, I'm curious if you could say more about what [[OPTIMADE]] does in this case? Is the idea that the zenodo plugin parses some paper, and then sends it to other listening clients that the parsed data comes from the paper? is it a vocabulary, or communication protocol, or both? and what kind of information would it be parsing / do materials scientists want to be able to analyze in an automated way? sorry if I am being dense, just curious because I've always admired materials but have had very little exposure.)
  • OPTIMADE#sneakers-the-rat-22-11-03 23:11:44  + (same message as above)
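
To make the Page Schemas walkthrough in the Page Forms / Page Schemas entries above a little more concrete, here is a minimal wikitext sketch of the kind of template such a schema might generate. It is an assumption for illustration, not the wiki's actual Template:Participant: the field names (`Name`, `Interested In`) and the `Has name` property are made up, and `#arraymap` comes from the Page Forms extension.

```
<noinclude>Minimal sketch of a schema-generated template (illustrative only).</noinclude><includeonly>
{| class="wikitable"
! Name
| [[Has name::{{{Name|}}}]]
|-
! Interested In
<!-- map each comma-separated value to its own semantic annotation -->
| {{#arraymap:{{{Interested In|}}}|,|x|[[Interested In::x]]}}
|}
[[Category:Participant]]</includeonly>
```

A participant page would then call something like `{{Participant|Name=...|Interested In=Linked Data, Wikis}}`, and each comma-separated interest becomes a queryable `Interested In` annotation.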
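
Similarly, the Wiki#Organization message above describes concept pages that list the participants who have declared an `Interested In` property. One way such a listing could be written (a guess at the mechanism, not necessarily what Template:Concept actually contains) is an inline query like:

```
{{#ask: [[Category:Participant]] [[Interested In::{{PAGENAME}}]]
 |format=list
 |default=No participants have declared interest yet.
}}
```

and a participant can declare interest directly on their page with an annotation such as `[[Interested In::Linked Data]]`.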