Property:Contains text

Showing 20 pages using this property.
T
[[Project Ideas#Linked Data Publishing On Activitypub]] ooh, I'm very interested in this. So are you thinking a [[Twitter#Bridge]] -> [[ActivityPub#Bridge]] where one could use markup within the Twitter post to declare [[Linked Data#Markup Syntax]] and then post to AP? I have thought about this kind of thing before, like using a bot command syntax to declare prefixes by doing something like ``` @bot prefix foaf: https:// (ontology URL) ``` or ``` @bot alias term: foaf.LongerNameForTerm ``` so that one could write a semantic wikilink like `[ [term::value] ]` either within the tweet or as a reply to it (so the tweet itself doesn't become cluttered/it can be organized post hoc?). I've also thought about a bridge (I called it [[Threadodo]]) that implements that kind of command syntax to archive threads directly to [[Zenodo]] along with structured information about the author, but this seems more interesting. I can help clear some of the groundwork out of the way to make it easier for you and other interested participants to experiment. I have asked around fedi a bunch for a very minimal AP server implementation, and I could try to find one (or we could try to prototype one) if you want to experiment with that :), and I can also document and show you a tweepy-based bot that has an extensible command/parsing system too  +
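A minimal sketch of how that command syntax could be parsed, assuming the hypothetical `@bot prefix` / `@bot alias` commands and the `[[term::value]]` semantic-wikilink form described above (all names here are illustrative, not an existing bot's API):

```python
import re

# Hypothetical syntaxes from the message above:
#   "@bot prefix foaf: http://xmlns.com/foaf/0.1/"  -> declare a namespace prefix
#   "@bot alias term: foaf.LongerNameForTerm"       -> alias a long term
#   "[[term::value]]"                               -> semantic wikilink in a post
PREFIX_RE = re.compile(r"@bot\s+prefix\s+(\w+):\s+(\S+)")
ALIAS_RE = re.compile(r"@bot\s+alias\s+(\w+):\s+(\S+)")
LINK_RE = re.compile(r"\[\[([^:\]]+)::([^\]]+)\]\]")

def parse_commands(text, prefixes=None, aliases=None):
    """Collect prefix/alias declarations and semantic wikilinks from a post.

    Returns updated (prefixes, aliases) dicts plus a list of
    (term, value) pairs found in the text.
    """
    prefixes = dict(prefixes or {})
    aliases = dict(aliases or {})
    for name, url in PREFIX_RE.findall(text):
        prefixes[name] = url
    for name, target in ALIAS_RE.findall(text):
        aliases[name] = target
    links = LINK_RE.findall(text)
    return prefixes, aliases, links
```

Running the parser over a reply (rather than the tweet itself) would support the "organize post hoc" idea: the declarations accumulate across a thread while the original post stays uncluttered.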
I think of all of these tools as "personal hypertext notebooks" - basically taking what is possible in wikis (organizing by means of linking, hypertext) and lowering the barrier to entry (no need to spin up a server; you can just download an app and go). The common thread across these notebooks is allowing for organizing and exploring by means of bidirectional hyperlinks between "notes":
- In [[Obsidian]], each linkable note is a markdown file and can be as short or long as you like
- In [[Logseq]]/[[Roam]] and other outliner-style notebooks, you can link "pages", and also individual bullets in the outlines on each page.
In this way, the core functionality of these tools is similar to a wiki, but they do leave out a lot of the collaborative functionality that makes wikis work well (granular versioning and edit histories, talk pages, etc.). So for folks like <@305044217393053697> who are comfortable with wikis already, they add marginal value IMO. Their technical predecessors in the "personal (vs. collaborative) wiki" space include [[TiddlyWiki]] and [[emacs org-mode]] (and they inherit their technical extensibility: many users create their own extensions of the notebooks' functionality; an example is the [[Roam Discourse Graph extension]] that <@824740026575355906> is using). These tools also tend to trace their idea lineage back to Vannevar Bush's [[Memex]] and Ted Nelson's [[Xanadu]].  +
V
[[Page Schemas#Creating a new Schema]] Page Schemas is mostly a handy way to generate boilerplate templates and link them to semantic properties. A Form (using [[Page Forms]]) is an interface for filling in values for a template. For an example of how this shakes out, see [[:Category:Participant]], [[Template:Participant]], and [[Form:Participant]].
* Go to a `Category:CategoryName` page, creating it if it doesn't already exist.
* Click "Create schema" in the top right.
* If you want a form, check the "Form" box. It is possible to make a schema without a form. The schema just defines what pages will be generated, and the generated pages can be further edited afterwards (note that this might make them inconsistent with the schema).
* Click "Add template". If you are only planning on having one template per category, name the template the same thing as the category.
* Add fields! Each field can have a corresponding form input (with a type, e.g. a textbox, token input, date selector, etc.) and a semantic property.
* Once you're finished, save the schema.
* Click "Generate pages" on the category page. Typically you want to uncheck any pages that are already bluelinks so you don't overwrite them. You might have to do the 'generate pages' step a few times, and it can take a few minutes, because it's pretty buggy.  +
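As a side note, the "uncheck pages that are already bluelinks" step can be checked ahead of time: the standard MediaWiki API's `action=query` lists the given titles and flags nonexistent ones as missing. A small sketch of building such a query (the wiki URL is this wiki's; the helper name is made up for illustration):

```python
from urllib.parse import urlencode

# Standard MediaWiki API endpoint; existing pages are "bluelinks",
# nonexistent ones come back flagged "missing" in the JSON response.
API = "https://synthesis-infrastructures.wiki/api.php"

def existence_query(titles):
    """Build a URL asking the MediaWiki API which of `titles` already exist."""
    params = {
        "action": "query",
        "titles": "|".join(titles),  # the API takes pipe-separated titles
        "format": "json",
    }
    return API + "?" + urlencode(params)
```

Fetching that URL and checking which entries in the response carry a `missing` flag tells you which boxes are safe to leave checked before generating pages.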
W
I am about to go to bed, but personally I favor the model of the federated wiki: that the same "term" (or page title, in the case of the wiki) has many possible realizations, and what's useful is their multiplicity. I think Everything2 was an early model of this, but basically it cuts to the core of the history of early wikis, to the initial fork of Ward's wiki into Meatball. The singularity of meaning implied by Wikipedia is imo an artifact of wikis having been adopted by encyclopedists, with all the Diderot-like Enlightenment-era philosophy that entails. This seems exceptionally apt today and yesterday given Aaron Swartz's telling of that history, particularly his "[[Who Writes Wikipedia?]]". Everyone can contribute in a linked context, and that's what the synthesis of wiki-like thinking, linked data, and distributed messaging gives us :). I write about this idea more completely here: https://jon-e.net/infrastructure/#the-wiki-way after my take on the critical/ethical need for forking in information systems as given by the case study of NIH's Biomedical Translator (link to the most relevant part in the middle of the argument; the justification and motivation precedes it): https://jon-e.net/infrastructure/#problematizing-the-need-for-a-system-intended-to-link-all-or-eve  +
[[Wiki#Organization]] As we get towards proposing projects and organizing ideas, I've added a set of pages for the different concepts that y'all indicated either here or in your applications: https://synthesis-infrastructures.wiki/Concepts Each page gives a list of participants that have an `Interested In` property on their participant page (or you can declare interest on the page itself using the template; see the example at https://synthesis-infrastructures.wiki/Template:Concept ) as another way of finding people with similar interests. Feel free to add additional interests from your own page and add new pages by using the
`{| style="width: 30em; font-size: 90%; border: 1px solid #aaaaaa; background-color: #f9f9f9; color: black; margin-bottom: 0.5em; margin-right: 1em; padding: 0.2em;text-align:left;float:right;clear:right;"
! style="text-align: center; background-color:#ccccff;" colspan="2" |<span style="font-size: larger;">Wiki</span>
|-
| Interested Participants ||
|}
[[Category:Concept]]`
template on any new page. The pages are all stubs at the moment, but I have made links between related concepts/subconcepts/etc. These will also help us catch any wikilinks made from within the discord 🙂  +
[[Semantic MediaWiki]] vs [[WikiBase]]: you're right! Semantic MediaWiki is more of an interface that can support unstructured and structured information in the same place; it's a lot more freeform and gestural, but at the cost of predictability/strictness/performance as a database. Definitely different tools with different applications, albeit with a decent amount of overlap in philosophy.  +
Nice idea, that [[Wikibot]]! Do I understand correctly that it grabs all messages that contain a page name in double brackets, and adds them to the Wiki page with that name? (this message being as much a test as a question of course)  +
Thanks <@322545403876868096> ! Added to https://synthesis-infrastructures.wiki/Discourse_Modeling. I guess I could have used [[Wikibot]] for that, but it was easier to do it by hand than figuring out the intricacies of Wikibot.  +
Note to <@305044217393053697> about [[Wikibot]]: it doesn't pick up edits to messages that it has already added to the Wiki. The version in the Wiki ends up being obsolete. Could be important when someone edits to add "not", for example. Discord users are used to having this possibility.  +
those brackets cue the [[WikiBot]] to link the message to the wiki page containing the mentioned terms  +
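That cue can be sketched as a small matcher, assuming the bot simply regex-scans each message for `[[Page]]` or `[[Page#Section]]` links (the actual WikiBot implementation may differ):

```python
import re

# Matches [[Page]] and [[Page#Section]] wikilinks in a chat message.
WIKILINK_RE = re.compile(r"\[\[([^\]#]+)(?:#([^\]]+))?\]\]")

def extract_wikilinks(message):
    """Return (page, section) pairs for every wikilink in a message;
    section is None for plain [[Page]] links."""
    return [(page.strip(), section.strip() if section else None)
            for page, section in WIKILINK_RE.findall(message)]
```

Each extracted page name then tells the bot which wiki page the message text should be appended to.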
i'll leave the bot running for a lil bit, but yeah, it's just running on my laptop for now; I'll move it over to the Linode running the wiki when I go to switch the URL. made a page to document the [[WikiBot#Status Updates]]  +
omg lmao [[WikiBot#TODO]] Don't make a separate page using semantic wikilinks lol  +
Then I just made a page to link to the pages. There's not really a well-defined way to do meta-categorization like that in-medium as far as I'm aware, but I'm happy to receive [[WikiBot#Feature Requests]] about it  +
this is almost exactly the idea with the [[WikiBot]] that pushes to a [[Semantic Wiki]], and it's good to have a name for it in [[Gradual Enrichment]]. looking forward to digging through the references and finishing that piece^ tomorrow (and finishing the n-back linking syntax so I can just directly include the piece in the annotation that is this message). thanks for sharing 🙂  +
<@771783584105234462> [[WikiBot#Bugfixes]] just pushed an update to the wikibot that might fix the red X's you're getting - likely an error from when there isn't an avatar set, but the logs aren't being kept long enough for me to see for sure.  +
A reminder as the conversations start thickening (which has been great to read; looking forward to jumping in more later when I have a few minutes) and thus become a bit harder to keep track of: feel free to make liberal use of [[Wikilinks]] in your posts to archive them in the wiki and make them more discoverable by people outside of your table/project. (For example, this message will appear here: https://synthesis-infrastructures.wiki/Wikilinks ). This would be especially useful because it looks like some folks are interested in doing some <#1038988750677606432> on the wiki!  +
another group ( <#1038988750677606432> ) will, I believe, be analyzing the semantic information on the wiki ( https://synthesis-infrastructures.wiki/Main_Page ), and you can archive the text of any message onto a wiki page by using [[Wikilinks]] (so e.g. this message will go to https://synthesis-infrastructures.wiki/Wikilinks )  +
I've also recently been using Logseq. I like how it just writes to markdown. I've been wanting to parse that markdown, look for well-known #hashtags and [[wikitags]], and build an RDF dataset. It looks like SBML is kinda like XML, so maybe something similar is possible there. Have you done anything more with Logseq since this post in November?  +
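A rough sketch of that markdown-to-RDF idea: scan a page's text for hashtags and wikitags and emit plain subject–predicate–object triples (the predicate URIs and page-URI scheme below are placeholders, not an established vocabulary; a library like rdflib could then serialize the result):

```python
import re

HASHTAG_RE = re.compile(r"(?<!\S)#([\w-]+)")   # "#rdf" but not "a#b"
WIKITAG_RE = re.compile(r"\[\[([^\]]+)\]\]")   # "[[Linked Data]]"

# Placeholder predicate URIs, for illustration only.
TAGGED_WITH = "https://example.org/vocab/taggedWith"
LINKS_TO = "https://example.org/vocab/linksTo"

def markdown_to_triples(page_name, markdown):
    """Emit (subject, predicate, object) triples for each #hashtag and
    [[wikitag]] found in a Logseq markdown page."""
    subject = "https://example.org/page/" + page_name.replace(" ", "_")
    triples = []
    for tag in HASHTAG_RE.findall(markdown):
        triples.append((subject, TAGGED_WITH, tag))
    for target in WIKITAG_RE.findall(markdown):
        triples.append((subject, LINKS_TO, target))
    return triples
```

From there, each wikitag object could itself be minted as a page URI, which is what makes the bidirectional-link graph queryable as RDF.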