Informal contingency plan or exit strategy in the event that:
- the wiki is no longer viable to be hosted on Fandom (or any platform/hosting service)
- support for the wiki is not appropriate for this platform
- wiki requires more capabilities than what is offered/allowed on the platform
- Fandom servers are down
- one's Internet access is offline
- there are no active wiki moderators, administrators, or contributors
- WARFRAME is discontinued as a games-as-a-service
- an archived version of the wiki needs to be created
- an offline version of the wiki needs to be created
- Last updated: Fri, 05 Jul 2024 18:38:13 +0000 (UTC) by User:Cephalon Scientia
Basic Wiki Migration[]
- For a guide on how to create an XML dump of the wiki's content, see https://www.mediawiki.org/wiki/Manual:Backing_up_a_wiki#Backup_the_content_of_the_wiki_(XML_dump)
- For software applications that can read wiki XML dump files for offline reading, see https://en.wikipedia.org/wiki/Wikipedia:Database_download#Dynamic_HTML_generation_from_a_local_XML_database_dump
- For downloading database dumps of Fandom wikis, see https://community.fandom.com/wiki/Help:Database_download
- If you want the latest dump, ask one of the wiki's admins to request one. The fastest way to contact them is to join the wiki's Discord server: https://discord.gg/BTJuRqg
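For smaller, targeted exports, the snippet below is a minimal sketch of pulling per-page XML through the MediaWiki action API. It assumes Node 18+ for the global fetch, and the page titles are placeholders; for full backups, prefer the dump methods linked above.

// Minimal sketch: export a few pages as MediaWiki XML via the action API.
// Assumes Node 18+ (global fetch); titles below are placeholders.
const API = 'https://warframe.fandom.com/api.php'; // Fandom serves api.php at the wiki root

async function exportPages(titles) {
  const params = new URLSearchParams({
    action: 'query',
    titles: titles.join('|'),
    export: '1',
    exportnowrap: '1', // respond with bare <mediawiki> XML instead of an API wrapper
  });
  const res = await fetch(API + '?' + params);
  if (!res.ok) throw new Error('HTTP ' + res.status);
  return res.text(); // XML that Special:Import or importDump.php can ingest
}

exportPages(['Damage', 'Mods']).then((xml) => console.log(xml.length, 'bytes of XML'));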
Process[]
- Initiation
- User research, define user needs
- Market research, explore alternative solutions out there
- Minimum viable product + prototyping
- Planning
- Project planning, determining scope and team structure
- Execution
- Establish legal entity (optional)
- Seed round, initial financing
- Build a team, acquire resources and tools
- Product planning and design
- Build product's infrastructure
- Build product
- Porting content and assets
- Beta stage + stress test + user acceptance testing
- Accept crowdfunding and donations to support development
- Release candidate + user migration
- Long-term support + continued maintenance
- Closing
- Project evaluation ("is this project a success?")
- Continued support
- Or project termination
- Or turn over control of project to another entity
Forking Wiki[]
Hosting Local Wiki[]
- See the Software bundles article on the MediaWiki wiki for example software bundles that streamline the installation of MediaWiki-based wikis on your local machine.
- See WARFRAME Wiki:Solution Stack#Wiki and Special:Version for dependencies and MediaWiki extensions needed to properly render content on this wiki.
- See Manual:Running MediaWiki on Windows Subsystem for Linux for guidance on hosting an instance of MediaWiki on WSL for development purposes.
- As of September 2021, there is a command line tool that streamlines the setup of a MediaWiki development environment using Docker. See mw:Cli for more details.
- On Windows, WSL must be set up first (confirmed by Cephalon Scientia (talk) 21:02, 3 November 2022 (UTC)). On Linux and macOS it should work natively.
- Special:Import cannot import database dumps larger than 1024 MiB. The nginx config's client_max_body_size must be raised, otherwise imports fail with a 413 Request Entity Too Large error. See https://nginx.org/en/docs/http/ngx_http_core_module.html#client_max_body_size.
- If you have shell access to the MediaWiki installation, it is recommended to run the importDump.php maintenance script instead.
- Can also use Bitnami's MediaWiki Docker image https://hub.docker.com/r/bitnami/mediawiki
- See mw:MediaWiki-Docker for the Docker-based development environment included with MediaWiki core
Media[]
Guides on mass exporting/downloading images from a wiki:
- https://how-to.fandom.com/wiki/How_to_download_all_image_files_in_a_Wikimedia_Commons_page_or_directory
- https://www.mediawiki.org/wiki/Exporting_all_the_files_of_a_wiki
- https://www.mediawiki.org/wiki/Manual:Grabbers
- https://dev.fandom.com/wiki/DownloadImages
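As an API-based alternative to the guides above, the sketch below enumerates original-file URLs with list=allimages and API continuation (Node 18+ assumed); actually downloading the files and throttling the requests is left out.

// Sketch: list every file URL on the wiki via the action API (Node 18+).
const API = 'https://warframe.fandom.com/api.php';

async function listImageUrls() {
  const urls = [];
  let cont = { continue: '' }; // opt in to the modern continuation format
  while (cont) {
    const params = new URLSearchParams({
      action: 'query',
      list: 'allimages',
      aiprop: 'url',
      ailimit: '500', // maximum batch size for non-bot accounts
      format: 'json',
      ...cont,
    });
    const data = await (await fetch(API + '?' + params)).json();
    for (const img of data.query.allimages) urls.push(img.url);
    cont = data.continue; // undefined once the last batch is reached
  }
  return urls;
}

listImageUrls().then((urls) => console.log(urls.length, 'file URLs found'));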
Community Content?[]
Discussions and comment sections will most likely not be ported over, since Fandom's Discussions platform is separate from the MediaWiki database and is not included in XML dumps.
MediaWiki Configuration[]
Fandom wikis use a modified configuration for their MediaWiki installations compared to the default settings. Examples of what to change:
- Interwiki links, see https://community.fandom.com/wiki/Help:Interwiki_link and https://community.fandom.com/wiki/MediaWiki:Interwiki_map
Prefix | Wiki URL | Example |
---|---|---|
wikipedia | https://en.wikipedia.org/wiki/ | wikipedia:Main Page |
wiktionary | https://en.wiktionary.org/wiki/ | wiktionary:Main Page |
community | https://community.fandom.com/wiki/ | community:Main Page |
mw | https://www.mediawiki.org/wiki/ | mw:MediaWiki |
- Namespaces
mw.config.get('wgFormattedNamespaces')
-2: "Media"
-1: "Special"
0: "" (the main article namespace)
1: "Talk"
2: "User"
3: "User talk"
4: "WARFRAME Wiki"
5: "WARFRAME Wiki talk"
6: "File"
7: "File talk"
8: "MediaWiki"
9: "MediaWiki talk"
10: "Template"
11: "Template talk"
12: "Help"
13: "Help talk"
14: "Category"
15: "Category talk"
110: "Forum"
111: "Forum talk"
112: "Conclave" (custom namespace for this wiki)
113: "Conclave talk" (custom namespace for this wiki)
420: "GeoJson" (likely unused)
421: "GeoJson talk" (likely unused)
500: "User blog"
501: "User blog comment"
502: "Blog"
503: "Blog talk"
710: "TimedText" (likely unused)
711: "TimedText talk" (likely unused)
828: "Module"
829: "Module talk"
1200: "Message Wall"
1201: "Thread" (redirects to Message Wall namespace)
1202: "Message Wall Greeting"
2000: "Board" (redirects to https://warframe.fandom.com/f)
2001: "Board Thread"
2002: "Topic"
2900: "Map"
2901: "Map talk"
- A custom CSS stylesheet is bundled
- <gallery> tags render customized HTML, using CSS classes on div elements such as: wikia-gallery wikia-gallery-caption-below wikia-gallery-position-center wikia-gallery-spacing-small wikia-gallery-border-none wikia-gallery-captions-center wikia-gallery-caption-size-medium wikia-gallery-item
- Entry point URLs
Entry point | Default | Fandom |
---|---|---|
mw:Manual:$wgArticlePath | /w/index.php?title=$1 | /wiki/$1 |
mw:Manual:$wgScriptPath | /w | / |
mw:Manual:index.php | /w/index.php | /index.php |
mw:Manual:api.php | /w/api.php | /api.php |
mw:Manual:rest.php | /w/rest.php | /rest.php |
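As a small illustration of why this table matters when rewriting links or scripts, the sketch below builds page URLs from $wgArticlePath patterns; the example.org base is a placeholder for a forked host.

// Illustration only: how $wgArticlePath shapes article URLs on different hosts.
function articleUrl(base, articlePath, title) {
  return base + articlePath.replace('$1', encodeURIComponent(title.replace(/ /g, '_')));
}

articleUrl('https://example.org', '/w/index.php?title=$1', 'Main Page'); // default MediaWiki layout
articleUrl('https://warframe.fandom.com', '/wiki/$1', 'Main Page');      // Fandom layout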
Fandom Feature Parity[]
- See https://community.fandom.com/wiki/Category:Extensions_enabled_by_default for a list of MediaWiki extensions enabled by default on Fandom wikis.
- See https://community.fandom.com/wiki/Category:Extensions_enabled_on_request for a list of MediaWiki extensions available on request on Fandom wikis.
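One way to audit feature parity is to diff the extension lists reported by each wiki's API, as in the hedged sketch below (Node 18+; the example.org URL is a placeholder for the forked wiki). The same siteinfo query can also return namespaces and the interwiki map for the configuration notes above.

// Sketch: compare installed extensions between Fandom and a forked host via meta=siteinfo.
async function extensionNames(api) {
  const params = new URLSearchParams({
    action: 'query',
    meta: 'siteinfo',
    siprop: 'extensions', // 'namespaces' and 'interwikimap' are also valid siprop values
    format: 'json',
  });
  const data = await (await fetch(api + '?' + params)).json();
  return new Set(data.query.extensions.map((ext) => ext.name));
}

async function compareExtensions() {
  const fandom = await extensionNames('https://warframe.fandom.com/api.php');
  const fork = await extensionNames('https://example.org/w/api.php'); // placeholder URL
  return [...fandom].filter((name) => !fork.has(name));
}

compareExtensions().then((missing) => console.log('Not yet installed on the fork:', missing));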
Decoupling From Fandom Dev Wiki Repository[]
Since the "Dev:" prefix for importing Lua modules from https://dev.fandom.com/wiki/ does not work in vanilla MediaWiki, all of these modules and their dependencies will need to be forked (ideally via Special:Export and Special:Import) into their own Module pages on the WARFRAME Wiki. These imports can then use the normal "Module:" prefix.
local module = require("Dev:ModuleName")
-- Replace with:
local module = require("Module:ModuleName")
The same idea applies to JS scripts imported from the Fandom Dev wiki. Any script listed in MediaWiki:ImportJS that starts with the "Dev:" prefix will need its source code and dependencies forked to the appropriate MediaWiki page. These scripts may need to be modified because of differences between the CSS stylesheets Fandom uses and the new host environment, or may not work at all due to different feature sets:
- Remove any mention of the u:dev: interwiki prefix when importing other JS scripts; those scripts should be hosted on the wiki itself.
- Replace importArticle()/importArticles() with mw.loader.using(), which is available in vanilla MediaWiki (see the sketch below).
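A rough before/after, using a Dev script from the audit below as the example; the local page name MediaWiki:BackToTopButton.js is a placeholder. Note that mw.loader.load() is the usual way to pull in page-hosted scripts on vanilla MediaWiki, while mw.loader.using() resolves registered ResourceLoader/extension modules.

// Fandom-style import of a Dev wiki script:
importArticles({
  type: 'script',
  articles: ['u:dev:MediaWiki:BackToTopButton/code.js'],
});

// Rough vanilla-MediaWiki equivalent once the script is forked to a local page
// (page name is a placeholder):
mw.loader.load('/index.php?title=MediaWiki:BackToTopButton.js&action=raw&ctype=text/javascript');

// mw.loader.using() is for registered ResourceLoader modules that the forked
// script may depend on:
mw.loader.using(['mediawiki.util', 'mediawiki.api']).then(function () {
  // code that depends on those core modules goes here
});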
JS Audit[]
- EnemyInfoboxSlider.js // Adjusting armor/health/effective health values in enemy infoboxes with a slider
- Calculator.js (outdated; should be implemented as a Lua module so other contributors can update contents) // So far for armor calculations
- Maximization.js (outdated, can probably be removed) // Kselia and Tuna: magic interactive stuff
- Tooltips.js // Local settings/adjustments to dev:Tooltips.js
- dev:Tooltips.js (high priority, used in tooltip templates across the wiki; ideally should be implemented as a MediaWiki extension) // https://dev.fandom.com/wiki/Tooltips
- dev:Discord.js (can exist as a simple external Discord invite link, no need for a fancy widget) // https://dev.fandom.com/wiki/Discord
- CustomDiscordWidget.js // hilycker's customized Discord widget; https://warframe.fandom.com/wiki/MediaWiki:CustomDiscordWidget.js
- dev:BackToTopButton/code.js // https://dev.fandom.com/wiki/BackToTopButton
- dev:Countdown/code.js // https://dev.fandom.com/wiki/Countdown
- dev:EditIntroButton/code.js (QoL feature since content above the first section does not have a section edit button) // https://dev.fandom.com/wiki/EditIntroButton
- dev:ReferencePopups/code.js (<ref> tooltips) // https://dev.fandom.com/wiki/ReferencePopups
- ImageTooltip.js (deprecated, can be removed) // Image pop-up, mainly for weapon comparison tables, said Briz5
- LimitedEventCall.js (likely unused, can be removed) // AFD content loader
- dev:DupImageList/code.js (QoL feature for admins, not high priority) // https://dev.fandom.com/wiki/DupImageList
- dev:SeeMoreActivityButton/code.js // https://dev.fandom.com/wiki/SeeMoreActivityButton
- dev:ProfileTags.js (will break; replace with an equivalent solution or do not import at all) // https://dev.wikia.com/wiki/ProfileTags, replaces MediaWiki:Common.js/userRightsIcons.js
- RelicTable.js // For switching between relic refinement levels in the table on individual relic pages
- CollapseExpandButtons.js // Title says it all
- NightwaveActs.js (needs the hard-coded base URL decoupled; uses imports from the Fandom Dev wiki for i18n) // Automating display of active Nightwave acts using an API
- Countdown.js // User:FINNER's countdown timer for Plains day/night cycles, etc.
- Rotate-c.js // Initially used by the new T:Ability, might expand elsewhere
- CustomUCPFeatures.js (not compatible with non-Fandom wikis; can be removed) // Highlights admin/mod activity on history pages and such; also fixes anon profile links on talk pages/comments
- Mainpage.js // All homepage related, mainly for the new responsive video slider
- Mbox.js // For closing AMboxes
- dev:I18nEdit.js, dev:I18nEdit/notice.js // User translations editor UI for l10n as seen on https://warframe.fandom.com/wiki/Special:BlankPage/I18nEdit; might be replaced with a more proper i18n system like https://translatewiki.net/ if scripts are ported to a MediaWiki extension
- dev:OggPlayer.js (may be replaced with a native MediaWiki extension of similar feature set if possible) // https://dev.fandom.com/wiki/OggPlayer; replaces the default Kaltura audio player with a single-play audio button
Special:Export For JS[]
Copy and paste the list below into Special:Export. Dependencies were found by searching each script's source for "importArticle":
- MediaWiki:Tooltips.js
  - https://dev.fandom.com/wiki/MediaWiki:Tooltips.js
  - https://dev.fandom.com/wiki/MediaWiki:Tooltips.css
- MediaWiki:OggPlayer.js
  - https://dev.fandom.com/wiki/MediaWiki:OggPlayer.js
  - https://dev.fandom.com/wiki/MediaWiki:OggPlayer.css
  - https://dev.fandom.com/wiki/MediaWiki:I18n-js/code.js
- MediaWiki:I18nEdit.js
  - https://dev.fandom.com/wiki/MediaWiki:I18nEdit.js
  - https://dev.fandom.com/wiki/MediaWiki:I18nEdit.css
  - https://dev.fandom.com/wiki/MediaWiki:I18n-js/code.js
  - https://dev.fandom.com/wiki/MediaWiki:Dorui.js
  - https://dev.fandom.com/wiki/MediaWiki:WDSIcons/code.js
  - https://dev.fandom.com/wiki/MediaWiki:BannerNotification.js
- MediaWiki:I18nEdit/notice.js
  - https://dev.fandom.com/wiki/MediaWiki:I18nEdit/notice.js
- MediaWiki:ReferencePopups/code.js
  - https://dev.fandom.com/wiki/MediaWiki:ReferencePopups/code.js
  - https://dev.fandom.com/wiki/MediaWiki:ReferencePopups.css
  - https://dev.fandom.com/wiki/MediaWiki:ReferencePopups/jquery.effects.js
  - https://dev.fandom.com/wiki/MediaWiki:Colors/code.js
Web Application Architecture[]
Anyone forking the wiki will need to plan not only for wiki-level concerns but also for the web application architecture that hosts the actual website, if they do not want to use existing wiki farms or other web hosting platforms. Development instances would also require their own architecture for continuous development, continuous integration, and deployment.
- Domain Name System (DNS) name server
- Load balancers
- Content delivery networks (CDN)
- Web server
- Middleware
- Database servers
- Object storage
- Archival storage
- Security and monitoring
See:
- Fandom's engineering blog on Medium: https://medium.com/fandom-engineering
- Fandom's GitHub organization: https://github.com/Wikia
- Fandom's actual application repository is private according to https://dev.fandom.com/wiki/Repository
- https://www.mediawiki.org/wiki/Manual:MediaWiki_architecture
- Infrastructure as code (IaC) setups on AWS:
- https://www.reddit.com/r/aws/comments/6b83ug/most_highly_available_a_of_hosting_a_mediawiki/ - "So I was recently asked what the best/cheapest way to use AWS for a site based entirely on the mediawiki software (used by wikipedia) yet running on the full AWS stack"
- https://github.com/sdey-sag/mediawiki - "Install and configure mediawiki in AWS using Terraform and Ansible"
- https://github.com/PeterBodifee/MediaWiki-AWS - "Deploy MediaWiki on AWS using Elastic Beanstalk"
- https://medium.com/@mwasnik7/deployed-3-tier-web-application-using-terraform-and-aws-818295cc48e8 - "Deployed 3-tier Web Application using TerraForm and AWS"
Standard Operating Procedures[]
Data Governance[]
Data governance policies will need to be put in place, with goals such as[1]:
- Increasing consistency and confidence in decision making
- Decreasing the risk of regulatory fines
- Improving data security
- Defining and verifying the requirements for data distribution policies
- Designating accountability for information quality
- Enabling better planning by supervisory staff
- Minimizing or eliminating re-work
- Optimizing staff effectiveness
- Establishing process performance baselines to enable improvement efforts
Forked wikis will likely store personal information in databases for user accounts or collect it through web analytics. To ensure this data is handled responsibly and safely, there needs to be a formal data governance policy to abide by.
Data stewards would fill the functional role of enforcing these policies for compliance.
Code of Conduct[]
A code of conduct establishes working norms and fosters an environment where anyone feels welcome. Right now, much of this is implied, since many long-term editors and admins come with a progressive mindset, volunteering their time to contribute information for the greater community with no expectation of any return. The wiki also benefits from the WARFRAME community as a whole, thanks to the developers and their campaigns for accessibility and inclusivity. But a formal code of conduct would help guide collaboration in an actual working environment, rather than among disparate contributors who are intermittent in their contributions.
Long-Term Support[]
See WARFRAME Wiki:Cost Analysis for a glimpse of cost structure.
Development Team[]
For long-term support of the wiki and its community, the following roles, spanning business, technology, and community & operations functions, will need to be filled by first-party or third-party organizations:
- Webmaster
- Domain registrar
- Web developer
- Full stack developer/engineer
- System administrator
- Database administrator
- Site reliability engineer
- Cloud engineer
- DevOps engineer
- Security engineer
- Security steward
- QA engineer
- Build engineer
- Software designer/Solution architect
- UI/UX designer
- Product designer
- Service designer
- Product manager
- Wiki administrator
- Content moderator
- Knowledge manager
- Technical translator/software localizer
- Technical writer
- Customer/community support
- Technical support
- Social media manager/Community engagement
- Search engine optimization (SEO) specialist
- Accountant
- Human resource manager
- Data protection officer/data steward
- Legal consultant/advisor
In Comparison To Fandom's Org Structure[]
Derived from https://about.fandom.com/about.
- Business department
- E.g. strategy, administration, brand development
- Financial department
- E.g. accounting, procurement
- Legal department
- E.g. public policy, compliance, mergers and acquisitions
- Marketing department
- E.g. advertising
- People department (aka human resources)
- E.g. payroll, talent acquisition, onboarding
- Product department
- E.g. product strategy
- Revenue department
- E.g. sales, customer support
- Technology department
- E.g. information technology, engineering, research[2]
- ML Engineering team (machine learning engineering)
- DataEng team (data engineering)
- Ad Engineering team
- Platform team (MediaWiki team)
- CATS team (Creators, Admin Tools & Staff team)
- SiteX team (site experience team)
- Mobile Apps team
- User Generated Content (UGC) team
Scalability[]
In response to growing web traffic to wikis and high demand for access to timely and accurate information (especially with a live-service game), the issue of scalability needs to be addressed on these levels:
- Content (highest level)
- In its basic form, text displayed on wiki articles
- Audio
- Images
- Animated images
- Video
- Interactive (e.g. tooltips or forms)
- Web accessibility
- (Software) Development
- Continuous integration and deployment
- Test-driven development
- Source control
- DevSecOps
- Geographic
- Despite being an English wiki, around half of visitors come from countries where English is not the native tongue
- Data
- How data is stored, accessed, manipulated, and duplicated
- Partitioning
- Replication
- Database sharding
- Archival and long-term storage
- Infrastructure (lowest level)
- Content delivery networks
- Load balancers
- Web servers
- Databases
- "Serverless" architectures
- Microservices
- APIs
Passing Institutional Knowledge[]
This wiki relies heavily on templates and Lua modules to structure and automate content generation. If passing the torch to a new wiki host, admin, and/or developer, institutional knowledge needs to be shared or learned to ensure the longevity of the wiki. Institutional knowledge can fall under the following categories:
- Game system knowledge and game sense - how WARFRAME is typically played (e.g. the "meta"), situational awareness regarding the game (e.g. what builds to use in what context), and the complex systems that make up the core experience and gameplay loop of the game
- These can be picked up by playing the game at an "advanced" level, reading the wiki's articles on game mechanics, discussing optimal strategies with community members, and being in the community for a long time (how long differs from person to person)
- Community knowledge - how the general WARFRAME community behaves and responds to change, what is currently important to the general audience, and what it desires from the game
- These can be picked up by lurking and engaging in community forums, groups, and social media over several mainline updates
- Can have crossover with general Internet culture and/or (video) gaming culture
- Includes community-made resources like https://overframe.gg/ and https://warframe.market/ that complement in-game access to information
- Wiki editing knowledge - how wiki articles are typically structured, the collaboration process of editing, wiki's code of conduct and decorum, how and when to use wikitext syntax and certain templates, etc.
- Wiki scripting and development knowledge - how Lua and JavaScript scripts are used and developed to serve content to readers on the wiki; may also include custom stylesheets that the wiki uses on MediaWiki:Common.css.
- These can be picked up by reading documentation on pages in Module and MediaWiki namespaces and reading MediaWiki documentation. See Template:ModuleNav.
Any questions and concerns can be directed to active admins and moderators on the wiki's Discord channel: https://discord.gg/BTJuRqg
Governance[]
See https://meta.wikimedia.org/wiki/Wikimedia_power_structure for examples.
Wiki Archival[]
Online Access[]
Internet Archive has a multitude of snapshots of this wiki's articles. See the sitemap of currently archived articles: https://web.archive.org/web/sitemap/warframe.fandom.com
Other web archival projects:
- https://en.wikipedia.org/wiki/Wikipedia:List_of_web_archives_on_Wikipedia
- https://en.wikipedia.org/wiki/Help:Using_the_Wayback_Machine
- https://en.wikipedia.org/wiki/Help:Using_archive.today
- https://en.wikipedia.org/wiki/Wikipedia:Database_download
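The Wayback Machine also exposes an availability API that a recovery script could use to locate the latest snapshot of a given article; a minimal sketch (Node 18+) follows.

// Sketch: find the most recent Wayback Machine snapshot of a page, if any.
async function latestSnapshot(pageUrl) {
  const res = await fetch('https://archive.org/wayback/available?url=' + encodeURIComponent(pageUrl));
  const data = await res.json();
  return data.archived_snapshots?.closest?.url ?? null; // null when never archived
}

latestSnapshot('https://warframe.fandom.com/wiki/Damage').then(console.log);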
Offline Access[]
Approaches include printing articles to physical media, using an offline web browser or wiki reader, and saving wiki content on disk:
- As of 2021-09-29, the XML dump of the wiki (containing only wikitext) is about 8 GB in size. Using LZMA2:24 compression reduces that size to around 210 MB.
- As of 2020-11-10, according to Special:MediaStatistics, the wiki stores about[3]:
- 1.3 GB of .ogg files
- 120 MB of YouTube MIME type (not actual video, probably just thumbnail and basic details; videos are embedded and streamed from YouTube)
- 2.5 MB of .svg files
- 10 GB of .png files
- 4 GB of .jpg files
- 2.3 GB of .gif files
- About 21 GB of media (including article content). If using lossless compression like brotli or zstd to further compress this data (around 2-3 compression ratio), this can be reduced to ~10 GB of media.
- As of 2022-09-14, according to https://warframe.huijiwiki.com/wiki/特殊:媒体统计, the Chinese Huijiwiki site stores less media (probably because it is a relatively newer wiki, despite having almost 1:1 information parity with the English wiki)[4]:
- 581.29 MB of .ogg files
- 711 KB of .svg files
- 5 GB of .png files
- 1.02 GB of .jpg files
- 1.31 GB of .gif files
- 7.96 GB of media total (excluding article content; Huijiwiki does not appear to have a wiki dump readily available)
- Bundling web pages into HTML files
- monolith CLI (https://github.com/Y2Z/monolith)[5]
- Singlefile web extension and CLI (https://github.com/gildas-lormeau/SingleFile)
Communication Channels[]
Aside from on-wiki communication through Talk pages, the wiki would probably continue to use its Discord server for IRC-like communications, unless there is a major conflict of interest between parties involved in the wiki fork.
Otherwise, email can also be used as a universal communication medium.
Alternative enterprise solutions:
- Slack (as a Discord alternative or complement for internal communications)
- Jira (for project management)
Case Studies[]
Forked Wikis[]
Case studies of other community-driven wikis that changed platforms:
- https://allthetropes.org/wiki/All_The_Tropes:Why_Fork_TV_Tropes (2013-11-13)
- https://www.reddit.com/r/runescape/comments/9kqukw/runescape_wiki_leaving_wikia_now_launched_at/ (2018-10-02)
- https://forums.terraria.org/index.php?threads/new-official-terraria-wiki-launches-today.111239/ (2022-03-09)
- https://www.reddit.com/r/Minecraft/comments/16r3y8x/the_minecraft_wiki_has_moved_from_fandom_to/ (2023-09-24)
- https://www.reddit.com/r/wow/comments/176e0jq/wowpedia_has_moved_we_are_now_warcraft_wiki/ (2023-10-12)
Board of Directors[]
Case studies of open communities that formalized a board of directors to define and implement organizational policies for operating wikis or other community-based efforts:
- Weird Gloop Ltd. makes their board meetings public: https://meta.weirdgloop.org/. They meet once every 3 months with annual elections.
- The Organization for Transformative Works makes their board meetings public: https://www.transformativeworks.org/committees/board-of-directors/. They meet once every 3 months with annual elections.
Stretch Goals?[]
These alternative solutions require a lot of research and development; they may not even exist in the current technology landscape.
Custom Link Parser For Tooltips[]
This project aims to avoid relying on wiki templates, modules, and JavaScript in the MediaWiki namespace to create our tooltips. Instead, a MediaWiki extension would add logic to the parser to turn normal wikilinks into links with tooltips.
See https://wowpedia.fandom.com/wiki/Forum:UsingData_and_transclusion_boilerplates
Decentralized Wikis[]
This project aims to decouple wiki hosting from a single owner and instead distribute hosting across multiple independent hosts. Think of torrenting or other peer-to-peer networks that follow a similar principle.
- https://www.researchgate.net/publication/351379422_WikiChain_A_Blockchain-Based_Decentralized_Wiki_Framework
- https://en.wikipedia.org/wiki/InterPlanetary_File_System
- https://en.wikipedia.org/wiki/Content-addressable_storage
For a hybrid approach, compute and storage resources would somehow be distributed across independent machines, with a centralized orchestration platform to manage and monitor the independent hosts so they can serve wiki content at scale, similar to the idea of a content delivery network. This may borrow some ideas from web syndication, where users "subscribe" to a single source of truth to get content directly instead of relying on a middleman to host it; it would require using the user's compute and storage to effectively host their local copy of the wiki (like a Git repository). A toy sketch of the content-addressing idea this would likely build on follows.
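As a toy illustration of the content-addressing idea behind IPFS-style distribution, a page revision can be identified by a hash of its bytes, so any host serving the same content serves the same address. This is only a sketch using Node's built-in crypto module, not an actual IPFS client, and the wikitext is a placeholder.

// Toy content addressing: identical wikitext always hashes to the same address,
// regardless of which host serves it.
import { createHash } from 'node:crypto';

function contentAddress(wikitext) {
  return createHash('sha256').update(wikitext, 'utf8').digest('hex');
}

const revision = "'''Damage''' is a core mechanic..."; // placeholder wikitext
console.log('content address:', contentAddress(revision));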
Some core values that this approach follows are:
- Returning agency and autonomy of control to the users; more "democratic"
- Don't have to rely on wiki hosting vendors (wiki farms) to provide centrally-owned MediaWiki instances
- Don't have to rely on cloud computing companies (big or small) to provide centrally-owned hosting; host owners are (ideally) globally distributed
- However, network protocols and communications are centrally defined. Someone needs to set standards on how these hosts communicate with each other to ensure information parity for up-to-date info on the wiki's subject matter.
- Idea of "bring your own compute" and then sharing it with others in terms of hosting infrastructure (application hosting and server hosting)
- One could argue that you can already download an offline copy of the wiki, but there is a knowledge barrier to entry
- Wiki project lives and dies by the hands of the community on its own merit
- Censorship resistant; freedom of information; preservation/archival of digital media
- No matter your background, information is readily accessible by you, solely on the basis of having human rights regardless of what societal institution upholds or denies them or whatever systemic pressures block you from exercising your human rights
- Not truly censorship-free because of external factors like not having a phone or internet access
- Valuing user privacy
- Network resilience due to content distributed across multiple independent hosts
- The user has more control over their personal data because, ideally, there is no incentive to collect it for profit (avoiding user behavior tracking and data collection)
Some possible issues are:
- Cybersecurity concerns
- Malicious actors turning the system into a botnet
- The need of centralized authority for certain aspects
- Double-spending problem in decentralized systems
- Who is legally responsible for misuse?
- Network orchestration may need to be centralized, even if distributed
- Not true "democracy"; actual practice may diverge from ideals over time due to systemic pressures and limitations
- Poor performance if there are not enough compute/storage contributions across the world, plus overhead from compute-heavy cryptography operations
- Incompatibility with current Web 2.0 technologies; friction in user adoption
- Poor implementation and increased complexity with new technology, higher risk
- Adjacency to cryptocurrency; poor public trust
Language-independent Wikis Using Structured Data[]
This project aims to generate wiki article content from structured data representing lexical and ontological domain knowledge, allowing for programmatic localization. At a high level, think of it as replacing traditional wiki article content with template calls that run scripts which pull and process data to generate content in the reader's desired language. The underlying data would represent a knowledge map of the domain (in our case, WARFRAME, the video game IP).
One example of a wiki that follows this idea (albeit without a centralized data store for its knowledge base) is the Fandom Developers Wiki (https://dev.fandom.com/wiki/Fandom_Developers_Wiki), where articles use templates to localize content; a toy sketch of the idea follows the examples below. Examples:
- https://dev.fandom.com/wiki/Global_Lua_Modules/Tabber?action=edit - Canonical English article
- https://dev.fandom.com/wiki/Global_Lua_Modules/Tabber/ru?action=edit - Russian localization messages
- https://dev.fandom.com/wiki/Global_Lua_Modules/Tabber?uselang=ru - View the article in Russian using the uselang query parameter
- https://dev.fandom.com/wiki/I18nEdit?action=edit - Canonical English article
- https://dev.fandom.com/wiki/I18nEdit/ru?action=edit - Russian localization messages
- https://dev.fandom.com/wiki/I18nEdit/tr?action=edit - Turkish localization messages
- https://dev.fandom.com/wiki/Tooltips?action=edit - Canonical English article
- https://dev.fandom.com/wiki/Tooltips/es?action=edit - Spanish localization messages
- https://dev.fandom.com/wiki/Tooltips/fr?action=edit - French localization messages
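A toy sketch of the underlying idea, rendering a sentence from a structured record plus per-language message functions, in the spirit of the /ru and /es subpages above; the data and translations are made-up placeholders.

// Toy sketch: structured data plus per-language templates instead of hand-written prose.
const item = { name: 'Braton', type: 'rifle', masteryRank: 0 }; // placeholder record

const messages = {
  en: (i) => `The ${i.name} is a ${i.type} available at Mastery Rank ${i.masteryRank}.`,
  ru: (i) => `${i.name} - это винтовка, доступная на ранге мастерства ${i.masteryRank}.`, // placeholder translation
};

function render(record, lang) {
  return (messages[lang] ?? messages.en)(record); // fall back to English
}

console.log(render(item, 'ru'));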
Knowledge Base API / Search Engine[]
A possible byproduct of storing all knowledge in a structured data format is being able to query it, translating human-language queries (e.g. "Where do I farm X item?") into programmatic queries (e.g. the pseudocode "for each source in X item's drop locations, return name of source") to get a result, like a Google search for domain-specific knowledge. This could also be extended with emerging technologies such as generative AI as a chat assistant.
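A toy sketch of that pseudocode against a structured store; the dropSources table is a made-up placeholder, not real drop data.

// Toy sketch: answer "Where do I farm X?" from a structured knowledge base.
const dropSources = {
  'Neurodes': ['Earth', 'Eris', 'Lua', 'Deimos'], // placeholder data
  'Orokin Cell': ['Saturn', 'Ceres', 'Deimos'],   // placeholder data
};

function whereToFarm(itemName) {
  const sources = dropSources[itemName];
  return sources
    ? itemName + ' can be found on: ' + sources.join(', ')
    : 'No drop locations recorded for ' + itemName + '.';
}

console.log(whereToFarm('Neurodes'));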
Open Questions[]
- Is it possible to use an RSS feed to mirror content? For example, a listener on an RSS feed could trigger a function that copies changed pages into local storage for offline reading or rehosting (a rough sketch follows this list).
- Is SEO the end-all-be-all strategy for website discoverability? Is word-of-mouth/organic approach more desirable for growth in this age of genAI?
- With Google's dominance in SEO and its systemic push for profitability, Google search results seem to be returning poorer-quality matches and not giving the most accurate or relevant results.[6] Community and grassroots efforts seem to be more favorable, at least from a user's perspective; they are more "democratic" and return agency and autonomy to the masses.
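Regarding the RSS question above, a rough sketch (Node 18+) could poll the wiki's Special:RecentChanges feed and refetch the raw wikitext of each changed page, assuming the feed is enabled on the host. The feed parsing here is a crude regex; a real mirror would need a proper feed parser, throttling, and media handling.

// Rough sketch: refresh a local mirror from the RecentChanges RSS feed.
const WIKI = 'https://warframe.fandom.com';

async function changedTitles() {
  const xml = await (await fetch(WIKI + '/wiki/Special:RecentChanges?feed=rss')).text();
  // crude parse; titles containing XML entities would need unescaping
  const titles = [...xml.matchAll(/<title>([^<]+)<\/title>/g)].map((m) => m[1]);
  return [...new Set(titles.slice(1))]; // drop the feed's own <title>, then dedupe
}

async function fetchWikitext(title) {
  const url = WIKI + '/wiki/' + encodeURIComponent(title.replace(/ /g, '_')) + '?action=raw';
  return (await fetch(url)).text();
}

async function syncOnce() {
  for (const title of await changedTitles()) {
    const text = await fetchWikitext(title);
    console.log(title, '->', text.length, 'bytes'); // write to local storage or disk instead
  }
}

syncOnce();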
References[]
- ↑ From wikipedia:Data_governance
- ↑ https://medium.com/fandom-engineering
- ↑ (2020, November 10). Media statistics. Fandom. Accessed 2022-09-14. Archived from the original on 2022-09-14.
- ↑ (2022, September 14). 媒体统计. Huijiwiki. Accessed 2022-09-14. Archived from the original on 2022-09-14.
- ↑ (2024, March 24). Monolith – CLI tool for saving complete web pages as a single HTML file. Hacker News. Accessed 2024-03-25. Archived from the original on 2024-03-25.
- ↑ Germain, Thomas (2024, May 25). Google just updated its algorithm. The Internet will never be the same. BBC. Accessed 2024-06-16. Archived from the original on 2024-06-16.