

      Matthew Garrett: My a11y journey

      news.movim.eu / PlanetGnome • 20 June • 3 minutes

    23 years ago I was in a bad place. I'd quit my first attempt at a PhD for various reasons that were, with hindsight, bad, and I was suddenly entirely aimless. I lucked into picking up a sysadmin role back at TCM where I'd spent a summer a year before, but that's not really what I wanted in my life. And then Hanna mentioned that her PhD supervisor was looking for someone familiar with Linux to work on making Dasher, one of the group's research projects, more usable on Linux. I jumped.

    The timing was fortuitous. Sun were pumping money and developer effort into accessibility support, and the Inference Group had just received a grant from the Gatsby Foundation that involved working with the ACE Centre to provide additional accessibility support. And I was suddenly hacking on code that was largely ignored by most developers, supporting use cases that were irrelevant to most developers. Being in a relatively green field space sounds refreshing, until you realise that you're catering to actual humans who are potentially going to rely on your software to be able to communicate. That's somewhat focusing.

    This was, uh, something of an on the job learning experience. I had to catch up with a lot of new technologies very quickly, but that wasn't the hard bit - what was difficult was realising I had to cater to people who were dealing with use cases that I had no experience of whatsoever. Dasher was extended to allow text entry into applications without needing to cut and paste. We added support for introspection of the current application's UI so menus could be exposed via the Dasher interface, allowing people to fly through menu hierarchies and pop open file dialogs. Text-to-speech was incorporated so people could rapidly enter sentences and have them spoken out loud.

    But what sticks with me isn't the tech, or even the opportunities it gave me to meet other people working on the Linux desktop and forge friendships that still exist. It was the cases where I had the opportunity to work with people who could use Dasher as a tool to increase their ability to communicate with the outside world, whose lives were transformed for the better because of what we'd produced. Watching someone use your code and realising that you could write a three line patch that had a significant impact on the speed they could talk to other people is an incomparable experience. It's been decades and in many ways that was the most impact I've ever had as a developer.

    I left after a year to work on fruitflies and get my PhD, and my career since then hasn't involved a lot of accessibility work. But it's stuck with me - every improvement in that space is something that has a direct impact on the quality of life of more people than you expect, but is also something that goes almost unrecognised. The people working on accessibility are heroes. They're making all the technology everyone else produces available to people who would otherwise be blocked from it. They deserve recognition, and they deserve a lot more support than they have.

    But when we deal with technology, we deal with transitions. A lot of the Linux accessibility support depended on X11 behaviour that is now widely regarded as a set of misfeatures. It's not actually good to be able to inject arbitrary input into an arbitrary window, and it's not good to be able to arbitrarily scrape out its contents. X11 never had a model to permit this for accessibility tooling while blocking it for other code. Wayland does, but suffers from the surrounding infrastructure not being well developed yet. We're seeing that happen now, though - Gnome has been performing a great deal of work in this respect, and KDE is picking that up as well. There isn't a full correspondence between X11-based Linux accessibility support and Wayland, but for many users the Wayland accessibility infrastructure is already better than with X11.

    That's going to continue improving, and it'll improve faster with broader support. We've somehow ended up with the bizarre politicisation of Wayland as being some sort of woke thing while X11 represents the Roman Empire or some such bullshit, but the reality is that there is no story for improving accessibility support under X11 and sticking to X11 is going to end up reducing the accessibility of a platform.

    When you read anything about Linux accessibility, ask yourself whether you're reading something written by either a user of the accessibility features, or a developer of them. If they're neither, ask yourself why they actually care and what they're doing to make the future better.


      Michael Meeks: 2025-06-19 Thursday

      news.movim.eu / PlanetGnome • 19 June

    • Up early, tech planning call in the morning, mail catch-up, admin and TORF pieces.
    • Really excited to see the team get the first COOL 25.04 release shipped, coming to a browser near you:

      Seems our videos are getting more polished over time too which is good.
    • Mail, admin, compiled some code too.

      Peter Hutterer: libinput and tablet tool eraser buttons

      news.movim.eu / PlanetGnome • 19 June • 3 minutes

    This is, to some degree, a followup to this 2014 post . The TLDR of that is that, many a moon ago, the corporate overlords at Microsoft that decide all PC hardware behaviour decreed that the best way to handle an eraser emulation on a stylus is by having a button that is hardcoded in the firmware to, upon press, send a proximity out event for the pen followed by a proximity in event for the eraser tool. Upon release, they dogma'd, said eraser button shall virtually move the eraser out of proximity followed by the pen coming back into proximity. Or, in other words, the pen simulates being inverted to use the eraser, at the push of a button. Truly the future, back in the happy times of the mid 20-teens.

    In a world where you don't want to update your software for a new hardware feature, this of course makes perfect sense. In a world where you write software to handle such hardware features, significantly less so.

    Anyway, it is now 11 years later, the happy 2010s are over, and Benjamin and I have fixed this very issue in a few udev-hid-bpf programs but I wanted something that's a) more generic and b) configurable by the user. Somehow I am still convinced that disabling the eraser button at the udev-hid-bpf level will make users that use said button angry and, dear $deity, we can't have angry users, can we? So many angry people out there anyway, let's not add to that.

    To get there, libinput's guts had to be changed. Previously libinput would read the kernel events, update the tablet state struct and then generate events based on various state changes. This of course works great when you e.g. get a button toggle; it doesn't work quite as well when your state change was one or two event frames ago (because prox-out of one tool and prox-in of another tool are at least 2 events). Extracting that older state change was like swapping the type of meatballs from an ikea meal after it's been served - doable in theory, but very messy.

    Long story short, libinput now has an internal plugin system that can modify the evdev event stream as it comes in. It works like a pipeline: the events are passed from the kernel to the first plugin, modified, passed to the next plugin, etc. Eventually the last plugin is our actual tablet backend, which will update tablet state, generate libinput events, and generally be grateful about having fewer quirks to worry about. With this architecture we can hold back the proximity events and filter them (if the eraser comes into proximity) or replay them (if the eraser does not come into proximity). The tablet backend is none the wiser, it either sees proximity events when those are valid or it sees a button event (depending on configuration).
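
    As a rough sketch of that hold-back/replay idea (the event names and plugin interface below are invented for illustration - this is not libinput's actual plugin API):

    # Conceptual sketch only - invented names, not libinput's real API.
    from dataclasses import dataclass

    @dataclass
    class Event:
        type: str        # e.g. "PEN_PROX_OUT", "ERASER_PROX_IN", "BUTTON"
        code: int = 0

    class TabletBackend:
        """Stands in for the tablet backend at the end of the pipeline."""
        def handle(self, event):
            print("backend sees:", event)

    class EraserButtonPlugin:
        """Holds back a pen prox-out until we know whether it was the eraser button."""
        def __init__(self, next_stage, button_code=0x14c):  # 0x14c is BTN_STYLUS2, as an arbitrary example
            self.next = next_stage
            self.button_code = button_code  # button to report instead of the tool swap
            self.held = None                # held-back prox-out event, if any

        def handle(self, event):
            if self.held is None and event.type == "PEN_PROX_OUT":
                # Might be the firmware's fake pen-out/eraser-in pair - wait and see.
                self.held = event
            elif self.held is not None and event.type == "ERASER_PROX_IN":
                # It was the eraser button: filter both proximity events and
                # pass a button press down the pipeline instead.
                self.held = None
                self.next.handle(Event("BUTTON", self.button_code))
            elif self.held is not None:
                # The pen really left proximity: replay the held event, then this one.
                held, self.held = self.held, None
                self.next.handle(held)
                self.next.handle(event)
            else:
                self.next.handle(event)

    pipeline = EraserButtonPlugin(TabletBackend())
    pipeline.handle(Event("PEN_PROX_OUT"))
    pipeline.handle(Event("ERASER_PROX_IN"))  # the backend only ever sees a BUTTON event

    A real implementation also has to handle the release sequence (eraser prox-out followed by pen prox-in) and time out if no follow-up event arrives, but the principle is the same.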

    This architectural approach is so successful that I have now switched a bunch of other internal features over to use that internal infrastructure (proximity timers, button debouncing, etc.). And of course it laid the groundwork for the (presumably highly) anticipated Lua plugin support. Either way, happy times. For a bit. Because for those not needing the eraser feature, we've just increased your available tool button count by 100%[2] - now there's a headline for tech journalists that just blindly copy claims from blog posts.

    [1] Since this is a bit wordy, the libinput API call is just libinput_tablet_tool_config_eraser_button_set_button()
    [2] A very small number of styli have two buttons and an eraser button so those only get what, 50% increase? Anyway, that would make for a less clickbaity headline so let's handwave those away.


      Marcus Lundblad: Midsommer Maps

      news.movim.eu / PlanetGnome • 18 June • 4 minutes

    As tradition has it, it's about time for the (Northern Hemisphere) summer update on the happenings around Maps!

    about-49.alpha.png
    About dialog for GNOME Maps 49.alpha development


    Bug Fixes

    Since the GNOME 48 release in March, there's been some bug fixes, such as correctly handling daylight savings time in public transit itineraries retrieved from Transitous. Also James Westman fixed a regression where the search result popover wasn't showing on small screen devices (phones) because of sizing issues.

    More Clickable Stuff

    More symbols can now be directly selected in the map view by clicking/tapping on their symbols, like roads and house numbers (and then also, like any other POI, be marked as favorites).
    place-bubble-road.png
    Showing place information for the AVUS motorway in Berlin

    And related to traffic and driving, exit numbers are now shown for highway junctions (exits) when available.
    motorway-exit-right.png
    Showing information for a highway exit in a driving-on-the-right locality

    motorway-exit-left.png
    Showing information for a highway exit in a driving-on-the-left locality

    Note how the direction the arrow is pointing depends on the side of the road vehicle traffic drives on in the country/territory of the place…
    Also, as an additional attention to detail, the icon for the “Directions” button now shows a mirrored “turn off left” icon for places in drives-on-the-left countries.

    Furigana Names in Japanese

    For some time now (since around when we re-designed the place information “bubbles”) we have shown the native name of a place under the name translated into the user's locale (when they are different).
    There is an established OpenStreetMap tag for phonetic names in Japanese (using Hiragana), name:ja-Hira, akin to Furigana ( https://en.wikipedia.org/wiki/Furigana ), used to aid with the pronunciation of place names. I had been thinking that it might be a good idea to show this, when available, as the dimmed supplemental text in cases where the displayed name and the native name are identical and the Hiragana name is available, e.g. when the user's locale is Japanese and they are looking at Japanese names. For other locales, in these cases the displayed name would typically be the Romaji name, with the full Japanese (Kanji) name displayed under it as the native name.
    So, I took the opportunity to discuss this with my colleague Daniel Markstedt, who speaks fluent Japanese and has lived many years in Japan. As he liked the idea, and a demo of it, I decided to go ahead with this!
    hiragana-name.png
    Showing a place in Japanese with supplemental Hiragana name

    Configurable Measurement Systems

    Since more or less the start of time, Maps has shown distances in feet and miles when using a United States locale (or, more precisely, when measurements use such a locale: LC_MEASUREMENT, when speaking about environment variables), and standard metric measurements for other locales.
    Despite this, we have several times received bug reports about Maps not using the correct units. The issue here is that many users tend to prefer to have their computers speaking American English.
    So, I finally caved in and added an option to override the system default.
    hamburger-preferences.png
    Hamburger menu

    measurement-system-menu.png
    Hamburger menu showing measurement unit selection

    Station Symbols

    One feature I had been wanting to implement since we moved to vector tiles and integrated the customized highway shields from OpenStreetMap Americana is showing localized symbols for e.g. metro stations, such as the classic “roundel” symbol used in London and the “T” in Stockholm.
    After adding the network:wikidata tag to the pre-generated vector tiles, this has been possible to implement. We chose to rely on the Wikidata tag instead of the network name/abbreviations as this is more stable, and names could risk collisions with unrelated networks having the same (short) name.
    hamburg-u-bahn.png
    U-Bahn station in Hamburg

    copenhagen-metro.png
    Metro stations in Copenhagen

    boston-t.png
    Subway stations in Boston

    berlin-s-bahn.png
    S-Bahn station in Berlin

    This requires the stations being tagged consistently to work out. I did some mass tagging of metro stations in Stockholm, Oslo, and Copenhagen. Other than that, I mainly chose places where there's at least partial coverage already.
    If you'd like to contribute and update a network with the network Wikidata tag, I have prepared some quick steps for doing such an edit with the JOSM OpenStreetMap desktop editor.
    Download a set of objects to update using an Overpass query; as an example, selecting the stations of the Washington DC metro:

    [out:xml][timeout:90][bbox:{{bbox}}];
    (
      nwr["network"="Washington Metro"]["railway"="station"];
    );
    (._;>;);
    out meta;

    josm-download-overpass-query.png
    JOSM Overpass download query editor

    Select the region to download from

    josm-select-region.png
    Select region in JOSM

    Select to only show the data layer (not showing the background map) to make it easier to see the raw data.

    josm-show-layers.png
    Toggle data layers in JOSM

    Select the nodes.

    josm-select-nodes.png
    Show raw data points in JOSM

    Edit the field in the tag edit panel to update the value for all selected objects

    josm-edit-network-wikidata-tag.png
    Showing tags for selected objects

    Note that this sample assumes the relevant station nodes were already tagged with network names (the network tag). Other queries to limit the selection might be needed.

    It could also be a good idea to reach out to local OSM communities before making bulk edits like this (e.g. if there is no such tagging at all in a specific region) to make sure it would be aligned with expectations and such.

    Then it will also potentially take a while before it gets included in our monthly vector tile update.

    When this has been done, and given that a suitable icon is available (e.g. as public domain or freely licensed on Wikimedia Commons), it can be bundled in data/icons/stations and a definition added to the data mapping in src/mapStyle/stations.js.

    And More…

    One long-wanted feature is the ability to download maps for offline usage. Lately, this is precisely what James Westman has been working on.

    It's still an early draft, so we'll see when it is ready, but it already looks pretty promising.

    hamburger-preferences.png
    Showing the new Preferences option



    download-dialog.png
    Preferences dialog with downloads

    download-select-area.png
    Selecting region to download

    download-rename.png
    Entering a name for a downloaded region

    download-areas.png
    Dialog showing downloaded areas

    And that's it for now!


      Michael Meeks: 2025-06-18 Wednesday

      news.movim.eu / PlanetGnome • 18 June

    • Up too early, out for a run with J. Sync with Dave. Plugged away at calls, admin, partner call, sales call, catch up with Philippe and Italo.
    • Birthday presents at lunch - new (identical) trousers, and a variable DC power supply for some electronics.
    • Published the next strip around the excitement of setting up your own non-profit structure:

      Alley Chaggar: Demystifying The Codegen Phase Part 1

      news.movim.eu / PlanetGnome • 18 June • 3 minutes

    Intro

    I want to start off by saying I’m really glad that my last blog was helpful to many wanting to understand Vala’s compiler. I hope this blog will be just as informative and helpful. I want to talk a little about the basics of the compiler again, but this time focusing on the codegen phase: the phase that I’m actually working on, but which has the least information in the Vala Docs.

    In my last blog, I briefly mentioned the directories codegen and ccode being part of the codegen phase. This blog will go more into depth about them. The codegen phase takes the AST and outputs the C code tree (ccode* objects), from which the C code is generated and then compiled, usually by GCC or another C compiler you have installed. When dealing with this phase, it’s really beneficial to know and understand at least a little bit of C.

    ccode Directory

    • Many of the files in the ccode directory are derived from the class CCodeNode, valaccodenode.vala .
    • The files in this directory represent C Constructs. For example, the valaccodefunction.vala file represents a C code function. Regular C functions have function names, parameters, return types, and bodies that add logic. Essentially, what this class specifically does, is provide the building blocks for building a function in C.

      Cfunction3.png

         //...
        	writer.write_string (return_type);
            if (is_declaration) {
                writer.write_string (" ");
            } else {
                writer.write_newline ();
            }
            writer.write_string (name);
            writer.write_string (" (");
            int param_pos_begin = (is_declaration ? return_type.char_count () + 1 : 0 ) + name.char_count () + 2;
      
            bool has_args = (CCodeModifiers.PRINTF in modifiers || CCodeModifiers.SCANF in modifiers);
       //...
      

    This code snippet is part of the ccodefunction file, and what it’s doing is overriding the ‘write’ function that is originally from ccodenode. It’s actually writing out the C function.

    codegen Directory

    • The files in this directory are higher-level components responsible for taking the compiler’s internal representation, such as the AST, and transforming it into the C code model (ccode objects).
    • Going back to the example of the ccodefunction, codegen will take a function node from the abstract syntax tree (AST), and will create a new ccodefunction object. It then fills this object with information like the return type, function name, parameters, and body, which are all derived from the AST. Then the CCodeFunction.write() (the code above) will generate and write out the C function.

      //...
      private void add_get_property_function (Class cl) {
      		var get_prop = new CCodeFunction ("_vala_%s_get_property".printf (get_ccode_lower_case_name (cl, null)), "void");
      		get_prop.modifiers = CCodeModifiers.STATIC;
      		get_prop.add_parameter (new CCodeParameter ("object", "GObject *"));
      		get_prop.add_parameter (new CCodeParameter ("property_id", "guint"));
      		get_prop.add_parameter (new CCodeParameter ("value", "GValue *"));
      		get_prop.add_parameter (new CCodeParameter ("pspec", "GParamSpec *"));
        
      		push_function (get_prop);
      //...
      

    This code snippet is from valagobjectmodule.vala and it’s calling CCodeFunction (again from the valaccodefunction.vala ) and adding the parameters, which is calling valaccodeparameter.vala . What this would output is something that looks like this in C:

        void _vala_get_property (GObject *object, guint property_id, GValue *value, GParamSpec *pspec) {
           //... 
        }
    

    Why do all this?

    Now you might ask why? Why separate codegen and ccode?

    • We split things into codegen and ccode to keep the compiler organized, readable, and maintainable. It prevents us from having to constantly write C code representations from scratch all the time.
    • It also reinforces the idea of polymorphism and the ability of objects to behave differently depending on their subclass.
    • And it lets us do hidden generation by adding new helper functions, temporary variables, or inlined optimizations after the AST and before the C code output.

    Jsonmodule

    I’m happy to say that I am making a lot of progress with the JSON module I mentioned last blog. The JSON module follows other modules in the codegen very closely, specifically the gtk module and the gobject module. It will be calling ccode functions to make ccode objects and creating helper methods so that the user doesn’t need to manually override certain JSON methods.


      Jamie Gravendeel: UI-First Search With List Models

      news.movim.eu / PlanetGnome • 18 June • 7 minutes

    When managing large amounts of data, manual widget creation finds its limits. Not only because managing both data and UI separately is tedious, but also because performance will be a real concern.

    Luckily, there are two solutions for this in GTK:

    1. Gtk.ListView using a factory: more performant since it reuses widgets when the list gets long
    2. Gtk.ListBox’s bind_model(): less performant, but can use boxed list styling (see the short sketch after this list)
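
    For contrast, a minimal bind_model() sketch (assuming the Pet and Species classes defined later in this post) could look something like this; the rest of this post focuses on the Gtk.ListView approach:

    # Minimal Gtk.ListBox.bind_model() sketch for contrast; assumes the Pet
    # and Species classes defined later in this post.
    from gi.repository import Gio, Gtk

    def create_row(pet: Pet) -> Gtk.Widget:
        # Called once per item - rows are not recycled like Gtk.ListView's.
        return Gtk.Label(label=pet.name, halign=Gtk.Align.START)

    store = Gio.ListStore.new(Pet)
    store.append(Pet(name="Herman", species=Species.CAT))

    list_box = Gtk.ListBox(css_classes=["boxed-list"])
    list_box.bind_model(store, create_row)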

    This blog post provides an example of a Gtk.ListView containing my pets, which is sorted, can be searched, and is primarily made in Blueprint.

    The app starts with a plain window:

    from gi.repository import Adw, Gtk
    
    
    @Gtk.Template.from_resource("/app/example/Pets/window.ui")
    class Window(Adw.ApplicationWindow):
        """The main window."""
    
        __gtype_name__ = "Window"
    
    using Gtk 4.0;
    using Adw 1;
    
    template $Window: Adw.ApplicationWindow {
      title: _("Pets");
      default-width: 450;
      default-height: 450;
    
      content: Adw.ToolbarView {
        [top]
        Adw.HeaderBar {}
      }
    }
    

    Data Object

    The Gtk.ListView needs a data object to work with, which in this example is a pet with a name and species.

    This requires a GObject.Object called Pet with those properties, and a GObject.GEnum called Species :

    from gi.repository import Adw, GObject, Gtk
    
    
    class Species(GObject.GEnum):
        """The species of an animal."""
    
        NONE = 0
        CAT = 1
        DOG = 2
    
    […]
    
    class Pet(GObject.Object):
        """Data for a pet."""
    
        __gtype_name__ = "Pet"
    
        name = GObject.Property(type=str)
        species = GObject.Property(type=Species, default=Species.NONE)
    

    List View

    Now that there’s a data object to work with, the app needs a Gtk.ListView with a factory and model.

    To start with, there’s a Gtk.ListView wrapped in a Gtk.ScrolledWindow to make it scrollable, using the .navigation-sidebar style class for padding:

    content: Adw.ToolbarView {
      […]
    
      content: ScrolledWindow {
        child: ListView {
          styles [
            "navigation-sidebar",
          ]
        };
      };
    };
    

    Factory

    The factory builds a Gtk.ListItem for each object in the model, and utilizes bindings to show the data in the Gtk.ListItem :

    content: ListView {
      […]
    
      factory: BuilderListItemFactory {
        template ListItem {
          child: Label {
            halign: start;
            label: bind template.item as <$Pet>.name;
          };
        }
      };
    };

    Model

    Models can be modified through nesting. The data itself can be in any Gio.ListModel , in this case a Gio.ListStore works well.

    The Gtk.ListView expects a Gtk.SelectionModel because that’s how it manages its selection, so the Gio.ListStore is wrapped in a Gtk.NoSelection :

    using Gtk 4.0;
    using Adw 1;
    using Gio 2.0;
    
    […]
    
    content: ListView {
      […]
    
      model: NoSelection {
        model: Gio.ListStore {
          item-type: typeof<$Pet>;
    
          $Pet {
            name: "Herman";
            species: cat;
          }
    
          $Pet {
            name: "Saartje";
            species: dog;
          }
    
          $Pet {
            name: "Sofie";
            species: dog;
          }
    
          $Pet {
            name: "Rex";
            species: dog;
          }
    
          $Pet {
            name: "Lady";
            species: dog;
          }
    
          $Pet {
            name: "Lieke";
            species: dog;
          }
    
          $Pet {
            name: "Grumpy";
            species: cat;
          }
        };
      };
    };
    

    Sorting

    To easily parse the list, the pets should be sorted by both name and species.

    To implement this, the Gio.ListStore has to be wrapped in a Gtk.SortListModel which has a Gtk.MultiSorter with two sorters, a Gtk.NumericSorter and a Gtk.StringSorter .

    Both of these need an expression: the property that needs to be compared.

    The Gtk.NumericSorter expects an integer, not a Species , so the app needs a helper method to convert it:

    class Window(Adw.ApplicationWindow):
        […]
    
        @Gtk.Template.Callback()
        def _species_to_int(self, _obj: Any, species: Species) -> int:
            return int(species)
    
    model: NoSelection {
      model: SortListModel {
        sorter: MultiSorter {
          NumericSorter {
            expression: expr $_species_to_int(item as <$Pet>.species) as <int>;
          }
    
          StringSorter {
            expression: expr item as <$Pet>.name;
          }
        };
    
        model: Gio.ListStore { […] };
      };
    };
    

    To learn more about closures, such as the one used in the Gtk.NumericSorter , consider reading my previous blog post .

    Search

    To look up pets even faster, the user should be able to search for them by both their name and species.

    Filtering

    First, the Gtk.ListView’s model needs the logic to filter the list by name or species.

    This can be done with a Gtk.FilterListModel which has a Gtk.AnyFilter with two Gtk.StringFilters.

    One of the Gtk.StringFilters expects a string, not a Species, so the app needs another helper method to convert it:

    class Window(Adw.ApplicationWindow):
        […]
    
        @Gtk.Template.Callback()
        def _species_to_string(self, _obj: Any, species: Species) -> str:
            return species.value_nick
    
    model: NoSelection {
      model: FilterListModel {
        filter: AnyFilter {
          StringFilter {
            expression: expr item as <$Pet>.name;
          }
    
          StringFilter {
            expression: expr $_species_to_string(item as <$Pet>.species) as <string>;
          }
        };
    
        model: SortListModel { […] };
      };
    };
    

    Entry

    To actually search with the filters, the app needs a Gtk.SearchBar with a Gtk.SearchEntry.

    The Gtk.SearchEntry’s text property needs to be bound to the Gtk.StringFilters’ search properties to filter the list on demand.

    To be able to start searching by typing from anywhere in the window, the Gtk.SearchBar’s key-capture-widget has to be set to the window, in this case the template itself:

    content: Adw.ToolbarView {
      […]
    
      [top]
      SearchBar {
        key-capture-widget: template;
    
        child: SearchEntry search_entry {
          hexpand: true;
          placeholder-text: _("Search pets");
        };
      }
    
      content: ScrolledWindow {
        child: ListView {
          […]
    
          model: NoSelection {
            model: FilterListModel {
              filter: AnyFilter {
                StringFilter {
                  search: bind search_entry.text;
                  […]
                }
    
                StringFilter {
                  search: bind search_entry.text;
                  […]
                }
              };
    
              model: SortListModel { […] };
            };
          };
        };
      };
    };
    

    Toggle Button

    The Gtk.SearchBar should also be toggleable with a Gtk.ToggleButton.

    To do so, the Gtk.SearchBar’s search-mode-enabled property should be bidirectionally bound to the Gtk.ToggleButton’s active property:

    content: Adw.ToolbarView {
      [top]
      Adw.HeaderBar {
        [start]
        ToggleButton search_button {
          icon-name: "edit-find-symbolic";
          tooltip-text: _("Search");
        }
      }
    
      [top]
      SearchBar {
        search-mode-enabled: bind search_button.active bidirectional;
        […]
      }
    
      […]
    };
    

    The search_button should also be toggleable with a shortcut, which can be added with a Gtk.ShortcutController :

    [start]
    ToggleButton search_button {
      […]
    
      ShortcutController {
        scope: managed;
    
        Shortcut {
          trigger: "<Control>f";
          action: "activate";
        }
      }
    }
    

    Empty State

    Last but not least, the view should fall back to an Adw.StatusPage if there are no search results.

    This can be done with a closure for the visible-child-name property in an Adw.ViewStack or Gtk.Stack . I generally prefer an Adw.ViewStack due to its animation curve.

    The closure takes the number of items in the Gtk.NoSelection as input, and returns the correct Adw.ViewStackPage name:

    class Window(Adw.ApplicationWindow):
        […]
    
        @Gtk.Template.Callback()
        def _get_visible_child_name(self, _obj: Any, items: int) -> str:
            return "content" if items else "empty"
    
    content: Adw.ToolbarView {
      […]
    
      content: Adw.ViewStack {
        visible-child-name: bind $_get_visible_child_name(selection_model.n-items) as <string>;
        enable-transitions: true;
    
        Adw.ViewStackPage {
          name: "content";
    
          child: ScrolledWindow {
            child: ListView {
              […]
    
              model: NoSelection selection_model { […] };
            };
          };
        }
    
        Adw.ViewStackPage {
          name: "empty";
    
          child: Adw.StatusPage {
            icon-name: "edit-find-symbolic";
            title: _("No Results Found");
            description: _("Try a different search");
          };
        }
      };
    };
    

    End Result

    from typing import Any
    
    from gi.repository import Adw, GObject, Gtk
    
    
    class Species(GObject.GEnum):
        """The species of an animal."""
    
        NONE = 0
        CAT = 1
        DOG = 2
    
    
    @Gtk.Template.from_resource("/org/example/Pets/window.ui")
    class Window(Adw.ApplicationWindow):
        """The main window."""
    
        __gtype_name__ = "Window"
    
        @Gtk.Template.Callback()
        def _get_visible_child_name(self, _obj: Any, items: int) -> str:
            return "content" if items else "empty"
    
        @Gtk.Template.Callback()
        def _species_to_string(self, _obj: Any, species: Species) -> str:
            return species.value_nick
    
        @Gtk.Template.Callback()
        def _species_to_int(self, _obj: Any, species: Species) -> int:
            return int(species)
    
    
    class Pet(GObject.Object):
        """Data about a pet."""
    
        __gtype_name__ = "Pet"
    
        name = GObject.Property(type=str)
        species = GObject.Property(type=Species, default=Species.NONE)
    
    using Gtk 4.0;
    using Adw 1;
    using Gio 2.0;
    
    template $Window: Adw.ApplicationWindow {
      title: _("Pets");
      default-width: 450;
      default-height: 450;
    
      content: Adw.ToolbarView {
        [top]
        Adw.HeaderBar {
          [start]
          ToggleButton search_button {
            icon-name: "edit-find-symbolic";
            tooltip-text: _("Search");
    
            ShortcutController {
              scope: managed;
    
              Shortcut {
                trigger: "f";
                action: "activate";
              }
            }
          }
        }
    
        [top]
        SearchBar {
          key-capture-widget: template;
          search-mode-enabled: bind search_button.active bidirectional;
    
          child: SearchEntry search_entry {
            hexpand: true;
            placeholder-text: _("Search pets");
          };
        }
    
        content: Adw.ViewStack {
          visible-child-name: bind $_get_visible_child_name(selection_model.n-items) as <string>;
          enable-transitions: true;
    
          Adw.ViewStackPage {
            name: "content";
    
            child: ScrolledWindow {
              child: ListView {
                styles [
                  "navigation-sidebar",
                ]
    
                factory: BuilderListItemFactory {
                  template ListItem {
                    child: Label {
                      halign: start;
                      label: bind template.item as <$Pet>.name;
                    };
                  }
                };
    
                model: NoSelection selection_model {
                  model: FilterListModel {
                    filter: AnyFilter {
                      StringFilter {
                        expression: expr item as <$Pet>.name;
                        search: bind search_entry.text;
                      }
    
                      StringFilter {
                        expression: expr $_species_to_string(item as <$Pet>.species) as <string>;
                        search: bind search_entry.text;
                      }
                    };
    
                    model: SortListModel {
                      sorter: MultiSorter {
                        NumericSorter {
                          expression: expr $_species_to_int(item as <$Pet>.species) as <int>;
                        }
    
                        StringSorter {
                          expression: expr item as <$Pet>.name;
                        }
                      };
    
                      model: Gio.ListStore {
                        item-type: typeof<$Pet>;
    
                        $Pet {
                          name: "Herman";
                          species: cat;
                        }
    
                        $Pet {
                          name: "Saartje";
                          species: dog;
                        }
    
                        $Pet {
                          name: "Sofie";
                          species: dog;
                        }
    
                        $Pet {
                          name: "Rex";
                          species: dog;
                        }
    
                        $Pet {
                          name: "Lady";
                          species: dog;
                        }
    
                        $Pet {
                          name: "Lieke";
                          species: dog;
                        }
    
                        $Pet {
                          name: "Grumpy";
                          species: cat;
                        }
                      };
                    };
                  };
                };
              };
            };
          }
    
          Adw.ViewStackPage {
            name: "empty";
    
            child: Adw.StatusPage {
              icon-name: "edit-find-symbolic";
              title: _("No Results Found");
              description: _("Try a different search");
            };
          }
        };
      };
    }
    

    List models are pretty complicated, but I hope that this example provides a good idea of what’s possible from Blueprint, and is a good stepping stone to learn more.

    Thanks for reading!

    PS: a shout out to Markus for guessing what I’d write about next ;)


      Hari Rana: It’s True, “We” Don’t Care About Accessibility on Linux

      news.movim.eu / PlanetGnome • 18 June • 10 minutes

    Introduction

    What do concern trolls and privileged people without visible or invisible disabilities who share or make content about accessibility on Linux being trash without contributing anything to projects have in common? They don’t actually really care about the group they’re defending; they just exploit these victims’ unfortunate situation to fuel hate against groups and projects actually trying to make the world a better place.

    I never thought I’d be upset to the point of writing an article about something this sensitive with a clickbait-y title. It’s simultaneously demotivating, unproductive, and infuriating. I’m here writing this post fully knowing that I could have been working on accessibility in GNOME, but really, I’m so tired of having my mood ruined because of privileged people spending at most 5 minutes to write erroneous posts and then pretending to be oblivious when confronted, while it takes us 5 months of unpaid work to get a quarter of the recognition, let alone acknowledgment, without accounting for the time “wasted” addressing these accusations. This is far from the first time, and it will certainly not be the last.

    I’m Not Angry

    I’m not mad. I’m absolutely furious and disappointed in the Linux Desktop community for staying quiet about any kind of celebration of advances in accessibility, while proceeding to share content and cheer for random privileged people from big-name websites or social media who have literally put a negative amount of effort into advancing accessibility on Linux. I’m explicitly stating a negative amount because they actually make it significantly more stressful for us.

    None of this is fair. If you’re the kind of person who stays quiet when we celebrate huge accessibility milestones, yet shares (or even makes) content that trash talks the people directly or indirectly contributing to the fucking software you use for free, you are the reason why accessibility on Linux is shit.

    No one in their right mind wants to volunteer in a toxic environment where their efforts are hardly recognized by the public and they are blamed for “not doing enough”, especially when they are expected to take in all kinds of harassment, nonconstructive criticism, and slander for a salary of 0$.

    There’s only one thing I am shamefully confident about: I am not okay in the head. I shouldn’t be working on accessibility anymore. The recognition-to-smearing ratio is unbearably low and arguably unhealthy, but leaving people in unfortunate situations behind is also not in accordance with my values.

    I’ve been putting so much effort, quite literally hundreds of hours, into:

    1. thinking of ways to come up with inclusive designs and experiences;
    2. imagining how I’d use something if I had a certain disability or condition;
    3. asking for advice and feedback from people with disabilities;
    4. not getting paid from any company or organization; and
    5. making sure that all the accessibility-related work is in the public, and stays in the public .

    Number 5 is especially important to me. I personally go as far as to refuse to contribute to projects under a permissive license , and/or that utilize a contributor license agreement , and/or that utilize anything riskily similar to these two, because I am of the opinion that no amount of code for accessibility should either be put under a paywall or be obscured and proprietary .

    Permissive licenses make it painlessly easy for abusers to fork, build an ecosystem on top of it which may include accessibility-related improvements, slap a price tag alongside it, all without publishing any of these additions/changes. Corporations have been doing that for decades, and they’ll keep doing it until there’s heavy push back. The only time I would contribute to a project under a permissive license is when the tool is the accessibility infrastructure itself. Contributor license agreements are significantly worse in that regard , so I prefer to avoid them completely.

    The Truth Nobody Is Telling You

    KDE hired a legally blind contractor to work on accessibility throughout the KDE ecosystem , including complying with the EU Directive to allow selling hardware with Plasma .

    GNOME’s new executive director, Steven Deobald, is partially blind .

    The GNOME Foundation has been investing a lot of money to improve accessibility on Linux, for example funding Newton, a Wayland accessibility project and AccessKit integration into GNOME technologies . Around 250,000€ (1/4) of the STF budget was spent solely on accessibility. And get this: literally everybody managing these contracts and communication with funders are volunteers; they’re ensuring people with disabilities earn a living, but aren’t receiving anything in return . These are the real heroes who deserve endless praise.

    The Culprits

    Do you want to know who we should be blaming? Profiteers who are profiting from the community’s effort while investing very little to nothing into accessibility.

    This includes a significant portion of the companies sponsoring GNOME and even companies that employ developers to work on GNOME. These companies are the ones making hundreds of millions, if not billions, in net profit indirectly from GNOME (and other free and open-source projects), and investing little to nothing into them. However, the worst offenders are the companies actively using GNOME without ever donating anything to fund the projects.

    Some companies actually do put in an effort, like Red Hat and Igalia. Red Hat employs people with disabilities to work on accessibility in GNOME, one of whom I actually rely on when making accessibility-related contributions in GNOME. Igalia funds Orca, the screen reader used in GNOME, which is something the Linux community should be thankful for. However, companies have historically invested what’s necessary to comply with governments’ accessibility requirements, and then never invested in it again.

    The privileged people who keep sharing and making content about accessibility on Linux being bad, without contributing anything to it, are, in my opinion, significantly worse than the companies profiting off of GNOME. Companies stay quiet, but these privileged people add an additional burden on contributors by either trash talking or sharing trash talkers. Once again, no volunteer deserves to be put in the position of being shamed and ridiculed for “not doing enough”, since no one is entitled to their free time but themselves.

    My Work Is Free but the Worth Is Not

    Earlier in this article, I mentioned, and I quote: “I’ve been putting so much effort, quite literally hundreds of hours […]”. Let’s put an emphasis on “hundreds”. Here’s a list of most accessibility-related merge requests that have been incorporated into GNOME:

    GNOME Calendar’s !559 addresses an issue where event widgets were unable to be focused and activated by the keyboard. That issue had been present since the very beginning of GNOME Calendar’s existence, to be specific: for more than a decade. This alone was a two-week effort. Despite it being less than 100 lines of code, nobody truly knew what to do to have them working properly before. This was followed up by !576, which made the event buttons usable in the month view with a keyboard, and then !587, which properly conveys the states of the widgets. Both combined are another two-week effort.

    Then, at the time of writing this article, !564 adds 640 lines of code, which is something I’ve been volunteering on for more than a month, excluding the time before I opened the merge request.

    Let’s do a little bit of math together with ‘only’ !559 , !576 , and !587 . Just as a reminder: these three merge requests are a four-week effort in total, which I volunteered full-time—8 hours a day, or 160 hours a month. I compiled a small table that illustrates its worth:

    Country | Average Wage for Professionals Working on Digital Accessibility (WebAIM) | Total in Local Currency (160 hours) | Exchange Rate | Total (CAD)
    Canada | 58.71$ CAD/hour | 9,393.60$ CAD | N/A | 9,393.60$
    United Kingdom | 48.20£ GBP/hour | 7,712£ GBP | 1.8502 | 14,268.74$
    United States of America | 73.08$ USD/hour | 11,692.80$ USD | 1.3603 | 15,905.72$

    To summarize the table: those three merge requests that I worked on for free were worth 9,393.60$ CAD (6,921.36$ USD) in total at a minimum .

    Just a reminder:

    • these merge requests exclude the time spent to review the submitted code;
    • these merge requests exclude the time I spent testing the code;
    • these merge requests exclude the time we spent coordinating these milestones;
    • these calculations exclude the 30+ merge requests submitted to GNOME; and
    • these calculations exclude the merge requests I submitted to third-party GNOME-adjacent apps.

    Now just imagine how I feel when I’m told I’m “not doing enough”, either directly or indirectly, by privileged people who don’t rely on any of these accessibility features. Whenever anybody says we’re “not doing enough”, I feel very much included, and I will absolutely take it personally.

    It All Trickles Down to “GNOME Bad”

    I fully expect everything I say in this article to be dismissed or be taken out of context on the basis of ad hominem , simply by the mere fact I’m a GNOME Foundation member / regular GNOME contributor. Either that, or be subject to whataboutism because another GNOME contributor made a comment that had nothing to do with mine but ‘is somewhat related to this topic and therefore should be pointed out just because it was maybe-probably-possibly-perhaps ableist’ . I can’t speak for other regular contributors, but I presume that they don’t feel comfortable talking about this because they dared be a GNOME contributor. At least, that’s how I felt for the longest time.

    Any content related to accessibility that doesn’t dunk on GNOME doesn’t see as much engagement, activity, and reaction as content that actively attacks GNOME, regardless of whether the criticism is fair. Many of these people don’t even use these accessibility features; they’re just looking for every opportunity to say “GNOME bad” and will 🪄 magically 🪄 start caring about accessibility.

    Regular GNOME contributors like myself don’t always feel comfortable defending ourselves because dismissing GNOME developers just for being GNOME developers is apparently a trend…

    Final Word

    Dear people with disabilities,

    I won’t insist that we’re either your allies or your enemies—I have no right to claim that whatsoever.

    I wasn’t looking for recognition. I wasn’t looking for acknowledgment since the very beginning either. I thought I would be perfectly capable of quietly improving accessibility in GNOME, but because of the overall community’s persistence in smearing developers’ efforts without actually tackling the underlying issues within the stack, I think I’m justified in at least demanding acknowledgment from the wider community.

    I highly doubt it will happen anyway, because the Linux community feeds off of drama and trash talking instead of being productive, without realizing that this demotivates active contributors while pushing away potential ones. And worst of all: people with disabilities are the ones affected the most, because they are misled into thinking that we don’t care.

    It’s so unfair and infuriating that all the work I do and share online gains very little activity compared to random posts and articles from privileged people without disabilities that rant about the Linux desktop’s accessibility being trash. It doesn’t help that I become severely anxious when sharing accessibility-related work, for fear of it coming across as virtue signalling. The last thing I want is to (unintentionally) give any sign or impression of pretending to care about accessibility.

    I beg you, please keep writing banger posts like fireborn’s I Want to Love Linux. It Doesn’t Love Me Back series and their interluding post . We need more people with disabilities to keep reminding developers that you exist and your conditions and disabilities are a spectrum, and not absolute .

    We simultaneously need more interest from people with disabilities to contribute to free and open-source software, and the wider community to be significantly more intolerant of bullies who profit from smearing and demotivating people who are actively trying.

    We should take inspiration from “ Accessibility on Linux sucks, but GNOME and KDE are making progress ” by OSNews. They acknowledge that accessibility on Linux is suboptimal while recognizing the efforts of GNOME and KDE. As a community, we should promote progress more often.


      Matthew Garrett: Locally hosting an internet-connected server

      news.movim.eu / PlanetGnome • 17 June • 4 minutes

    I'm lucky enough to have a weird niche ISP available to me, so I'm paying $35 a month for around 600MBit symmetric data. Unfortunately they don't offer static IP addresses to residential customers, and nor do they allow multiple IP addresses per connection, and I'm the sort of person who'd like to run a bunch of stuff myself, so I've been looking for ways to manage this.

    What I've ended up doing is renting a cheap VPS from a vendor that lets me add multiple IP addresses for minimal extra cost. The precise nature of the VPS isn't relevant - you just want a machine (it doesn't need much CPU, RAM, or storage) that has multiple world-routable IPv4 addresses associated with it and has no port blocks on incoming traffic. Ideally it's geographically local and peers with your ISP in order to reduce additional latency, but that's a nice-to-have rather than a requirement.

    By setting that up you now have multiple real-world IP addresses that people can get to. How do we get them to the machine in your house you want to be accessible? First we need a connection between that machine and your VPS, and the easiest approach here is Wireguard . We only need a point-to-point link, nothing routable, and none of the IP addresses involved need to have anything to do with any of the rest of your network. So, on your local machine you want something like:

    [Interface]
    PrivateKey = privkeyhere
    ListenPort = 51820
    Address = localaddr/32

    [Peer]
    Endpoint = VPS:51820
    PublicKey = pubkeyhere
    AllowedIPs = vpswgaddr/32


    And on your VPS, something like:

    [Interface]
    Address = vpswgaddr/32
    SaveConfig = true
    ListenPort = 51820
    PrivateKey = privkeyhere

    [Peer]
    PublicKey = pubkeyhere
    AllowedIPs = localaddr/32


    The addresses here are (other than the VPS address) arbitrary - but they do need to be consistent, otherwise Wireguard is going to be unhappy and your packets will not have a fun time. Bring that interface up with wg-quick and make sure the devices can ping each other. Hurrah! That's the easy bit.
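
    For example, assuming both configs are saved as /etc/wireguard/wg0.conf on their respective machines:

    # on each end, bring the tunnel up
    wg-quick up wg0

    # then, from the local machine, check that the VPS's wireguard address responds
    ping vpswgaddr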

    Now you want packets from the outside world to get to your internal machine. Let's say the external IP address you're going to use for that machine is 321.985.520.309 and the wireguard address of your local system is 867.420.696.005 . On the VPS, you're going to want to do:

    iptables -t nat -A PREROUTING -p tcp -d 321.985.520.309 -j DNAT --to-destination 867.420.696.005

    Now, all incoming packets for 321.985.520.309 will be rewritten to head towards 867.420.696.005 instead (make sure you've set net.ipv4.ip_forward to 1 via sysctl !). Victory! Or is it? Well, no.
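
    (For reference, that forwarding toggle is just the following on the VPS:)

    sysctl -w net.ipv4.ip_forward=1
    # (drop a "net.ipv4.ip_forward = 1" line into /etc/sysctl.d/ to make it persistent)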

    What we're doing here is rewriting the destination address of the packets so instead of heading to an address associated with the VPS, they're now going to head to your internal system over the Wireguard link. Which is then going to ignore them, because the AllowedIPs statement in the config only allows packets coming from your VPS, and these packets still have their original source IP. We could rewrite the source IP to match the VPS IP, but then you'd have no idea where any of these packets were coming from, and that sucks. Let's do something better. On the local machine, in the peer, let's update AllowedIPs to 0.0.0.0/0 to permit packets from any source to appear over our Wireguard link. But if we bring the interface up now, it'll try to route all traffic over the Wireguard link, which isn't what we want. So we'll add table = off to the interface stanza of the config to disable that, and now we can bring the interface up without breaking everything but still allowing packets to reach us. However, we do still need to tell the kernel how to reach the remote VPN endpoint, which we can do with ip route add vpswgaddr dev wg0. Add this to the interface stanza as:

    PostUp = ip route add vpswgaddr dev wg0
    PreDown = ip route del vpswgaddr dev wg0


    That's half the battle. The problem is that they're going to show up there with the source address still set to the original source IP, and your internal system is (because Linux) going to notice it has the ability to just send replies to the outside world via your ISP rather than via Wireguard and nothing is going to work. Thanks, Linux. Thinux.

    But there's a way to solve this - policy routing. Linux allows you to have multiple separate routing tables, and define policy that controls which routing table will be used for a given packet. First, let's define a new table reference. On the local machine, edit /etc/iproute2/rt_tables and add a new entry that's something like:

    1 wireguard


    where "1" is just a standin for a number not otherwise used there. Now edit your wireguard config and replace table=off with table=wireguard - Wireguard will now update the wireguard routing table rather than the global one. Now all we need to do is to tell the kernel to push packets into the appropriate routing table - we can do that with ip rule add from localaddr lookup wireguard , which tells the kernel to take any packet coming from our Wireguard address and push it via the Wireguard routing table. Add that to your Wireguard interface config as:

    PostUp = ip rule add from localaddr lookup wireguard
    PreDown = ip rule del from localaddr lookup wireguard

    and now your local system is effectively on the internet.
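
    Putting it all together, the local machine's Wireguard config ends up looking roughly like this (same placeholder names as above):

    [Interface]
    PrivateKey = privkeyhere
    ListenPort = 51820
    Address = localaddr/32
    Table = wireguard
    PostUp = ip route add vpswgaddr dev wg0
    PreDown = ip route del vpswgaddr dev wg0
    PostUp = ip rule add from localaddr lookup wireguard
    PreDown = ip rule del from localaddr lookup wireguard

    [Peer]
    Endpoint = VPS:51820
    PublicKey = pubkeyhere
    AllowedIPs = 0.0.0.0/0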

    You can do this for multiple systems - just configure additional Wireguard interfaces on the VPS and make sure they're all listening on different ports. If your local IP changes then your local machines will end up reconnecting to the VPS, but to the outside world their accessible IP address will remain the same. It's like having a real IP without the pain of convincing your ISP to give it to you.
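
    As a sketch, a second machine might look like this on the VPS side (placeholder names in the same spirit as above: a second public address extip2, tunnel addresses vpswgaddr2 and localaddr2, and a new interface wg1 on a different port):

    # /etc/wireguard/wg1.conf on the VPS
    [Interface]
    Address = vpswgaddr2/32
    SaveConfig = true
    ListenPort = 51821
    PrivateKey = otherprivkeyhere

    [Peer]
    PublicKey = otherpubkeyhere
    AllowedIPs = localaddr2/32

    # plus a DNAT rule for the second public address
    iptables -t nat -A PREROUTING -p tcp -d extip2 -j DNAT --to-destination localaddr2

    The second local machine then gets its own interface config, routing table entry, and policy rule, exactly as described above.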
