Search by property
This page provides a simple browsing interface for finding entities described by a property and a named value. Other available search interfaces include the page property search and the ask query builder.
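The ask query builder mentioned above uses Semantic MediaWiki's inline `#ask` syntax, which can express the same property/value search directly in wikitext. As an illustrative sketch only (the property name `Epic` and the value `CPT` are placeholders, not properties confirmed to exist on this wiki), such a query might look like:

```wikitext
{{#ask: [[Epic::CPT]]
 |format=ul
 |limit=50
}}
```

Here `[[Epic::CPT]]` selects pages whose `Epic` property has the value `CPT`, `format=ul` renders the results as an unordered list, and `limit=50` caps the number of results returned.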
List of results
- T100856: Migrate some semantic information from data-parsoid to data-mw
- T102867: Expose page-global metadata and make it editable
- T105573: Automate grafana dashboard backups
- T106346: setup an alertable threshold for Cassandra heap dumps
- T111597: Devise means for experimental software working with live data
- T113733: column family cassandra metrics size
- T114402: Implement something similar to the RESTBase 'section' API to provide wikitext structure information
- T119043: Graph/Graphoid/Kartographer - data storage architecture
- T125085: Split the API MediaWiki appserver pool into two external/internal pools
- T127683: Graphoid should handle font fallback/selection for all Unicode planes
- T130639: All known clients of Parsoid HTML that require data-mw should fetch data-mw separately (if using RESTBase) or process the data-mw blob in Parsoid's pagebundle API response (if using Parsoid directly)
- T130643: Content Translation should load data-mw from a separate API call alongside the body content
- T130689: Google's services should load data-mw from a separate API call to RESTBase
- T132632: puppetize turning off reserved space for cassandra /srv
- T133547: set up automated HTML (restbase) dumps on francium
- T134237: Graphoid returns a 400 on MW API time-out
- T138093: Investigate query parameter normalization for MW/services
- T138933: Explore moving the Panoviewer gadget/Toolforge tool into production
- T139169: Add non-parallel MultiHttpClient fallback for environments that don't have curl available
- T142090: Add hover-card like summary (og:description) to open graph meta data printing plain summary and headline property in the SameAs schema
- T143743: Set up the foundation for the ReviewStream feed
- T145164: Add fields needed by ERI to mediawiki.revision-create
- T146810: Automate Graphoid deployment to beta cluster (and auto-rebuild?)
- T147581: RFC: Streamline Node.js testing+deployment
- T148036: [SPIKE] Investigate solutions for client side event logging
- T180051: Reduce the number of fields declared in elasticsearch by logstash
- T180626: [Spike 8hr] How should we limit resources used by chromium render service?
- T185233: Modern Event Platform
- T187241: Add page-related topics to EventStreams
- T187418: Enable multiple topics in EventStreams URL
- T189641: Service for checking the Pwned Passwords database
- T191024: Exception thrown while running DataSender::sendData in cluster codfw: Data should be a Document, a Script or an array containing Documents and/or Scripts
- T199096: Add support for wikidata summaries in the /page/summary/ endpoint
- T200594: Add client identifier to requests sent from Kartotherian to WDQS
- T201611: Deploy translation-server-v2
- T205919: TEC3:O3:O3.1:Q2 Goal - Move Blubberoid, ZoteroV2, and Graphoid through the production CD Pipeline
- T206268: Evaluate using TypeScript on node projects
- T210741: EventStreams process occasionally OOMs
- T211453: Remove dependency on WDQS for the recommendation API's morelike endpoint
- T213193: Migrate changeprop to kubernetes
- T213194: Migrate citoid to kubernetes
- T213195: Migrate cxserver to kubernetes
- T213345: Spin off (Parsoid) language variants functionality as a microservice?
- T213566: Transferring data from Hadoop to production MySQL database
- T214080: Rewrite Avro schemas (ApiAction, CirrusSearchRequestSet) as JSONSchema and produce to EventGate
- T214446: EventBus mediawiki extension should support multiple 'event service' endpoints
- T218812: RFC: Provide the ability to have time-delayed or time-offset jobs in the job queue
- T219552: Schema Registry HTTP Service
- T219556: Create schema[12]00[12] (schema.svc.{eqiad,codfw}.wmnet)
- T222377: Move kartotherian/tilerator logging to new logging pipeline