Semantic MediaWiki 4.2.0 released

Thursday, 18 July 2024 13:37 UTC


Semantic MediaWiki 4.2.0 (SMW 4.2.0) has been released today.

It is a feature release that brings a faceted search interface (Special:FacetedSearch) and adds the "source" parameter to the "ask" and "askargs" API modules. Compatibility was added for MediaWiki 1.40.x and 1.41.x as well as PHP 8.2.x. The release also contains maintenance and translation updates for system messages. Please refer to the help pages on installing or upgrading Semantic MediaWiki for detailed instructions.
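
As a sketch, a request to the "ask" module using the new "source" parameter might be built like this. The endpoint URL and the source name "elastic" are assumptions for illustration only; the query sources actually available depend on each wiki's configuration.

```python
from urllib.parse import urlencode

# Hypothetical endpoint; replace with your wiki's api.php.
API_ENDPOINT = "https://example.org/w/api.php"

params = {
    "action": "ask",                               # SMW "ask" API module
    "query": "[[Category:City]]|?Population|limit=5",
    "source": "elastic",                           # assumed source name (new in SMW 4.2.0)
    "format": "json",
}

# Build the request URL; any HTTP client could then fetch it.
request_url = API_ENDPOINT + "?" + urlencode(params)
print(request_url)
```

The same "source" parameter should apply analogously to the "askargs" module, which takes the query in a decomposed form.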

A three-day hackathon was held from June 5 to 7 in Goma, DRCongo. It brought together a highly technical group of enthusiastic participants from around the city to learn, collaborate, and build an open source product from scratch using Wikimedia APIs.

The hackathon was guided by three core objectives:

  • Create awareness: it is no secret that onboarding francophone developers into the Wikimedia technical space is not easy. The hackathon was an opportunity for francophone developers to learn how to get started smoothly with open source, thanks to Wikimedia.
  • Innovate and build: we wanted to build something from scratch, and to innovate while doing so. Emi is not just another “top visited articles” app; it shows how a given country accesses and visits Wikipedia articles, and there are more features yet to come.
  • Connect and collaborate: it’s all about community. Connecting and collaborating are things we always want to see at every in-person event, and the program was designed to give participants room to network.

Participants shared their impressions with us. Take their word for it.

Participating in the Wikimedia hackathon in Goma was a transformative experience. I not only gained new technical and collaborative skills, but I also witnessed the power of community and the importance of open source for local development. I am leaving with renewed motivation to contribute to Wikimedia projects and to keep promoting free access to knowledge. –User:Esaïe Bahati

I’m positively impressed with the event. It was a great experience of learning and working together. The commitment of the organizers to help the community has inspired me a lot and I’ve understood that we need to keep the cycle going by raising the awareness of others to participate in open source projects. I cannot finish my impression without mentioning that I really liked the simplicity of the organizers, they were ready to answer any question in a friendly way. It was almost impossible to recognize them because they were just other contributors like us. Thank you very much for this opportunity – User:Emmanuelbinen

Through this hackathon, I learned how to work under pressure and in a team, collaborating with the other team members. One thing that could be improved is the distribution of tasks among participants. Otherwise, the overall idea was great, and I would love to take part in this kind of activity again. Thanks to the whole team – User:KOLIAMA Nathanaël

Program Highlights

  • We welcomed 40 participants, including six women. Participants came mostly from Wikimedians of DRCongo User Group, Kali Academy, and other coding bootcamps or universities. Attendees were split into two groups:
    • Documenters (10)
    • Coders (30)
  • Most of our participants reported that they already knew something about open source and had made open source contributions
  • Regarding MediaWiki, participants reported that they knew about it, and some had contributed to it
  • Around 190 commits and more than 2,000 lines of code were pushed to GitHub
Hackathon in Goma, DRCongo, June 2024. Photo: BamLifa

Conclusion, Recognitions and Next Step

Pulling off a successful event truly takes a village, and we had a talented village! 😀 A heartfelt thank you to our incredible Core Team Wikimedia DRC (Valentin, Gloire, Cedrik, Josué Joe Makuta, Credo), Kali Academy (BamLifa, Delord), Sponsor (Wiki Mentor Africa, Igbo Wikimedians User Group) who all worked tirelessly to keep everything on track.

A special shoutout also goes to all our participants who contributed to the success of the event. We can’t finish this thank-you section without saying a special thanks to Benedict Udeh for making the collaboration come to life.

The hackathon was a great opportunity for francophone developers to get their feet wet in open source, thanks to Wikimedia and a smooth onboarding process. We’re proud of the product, Emi, that was developed and of the connections made. We look forward to seeing many other developers contribute to the project.

Next step: the app is still in alpha. We want to spread the word about it and call for more contributors. We also plan to migrate the app to Wikimedia tools like Phabricator, Toolforge…

On June 11, 2024, Code for Africa’s African Wikipedia Alliance (AWA) hosted a monthly meetup for its Francophone community titled “Introduction à Wikimedia Commons.” The online meetup was facilitated by Azogbonon Constant, a member of the Africa Wikipedia Alliance Francophone Community and Founder of the Réseau @pprendre (Learning Network). The session was moderated by Dalilah Yaro, Community Coordinator COPP, with backend support from Bukola James, the Community Coordinator of AWA. It had 14 participants, including project leads and volunteer Wikimedians from Benin and Burundi, and provided an introduction to the basics of Wikimedia Commons, the types of Creative Commons licenses on Commons, guidance on how to upload images, and the types of campaigns participants can contribute to on Wikimedia Commons.

Key Highlights

The webinar commenced with a summary of the importance of Wikimedia Commons as a repository for free-use media files and its significance for various Wikimedia projects. Participants learned about the different types of Creative Commons licenses, their implications, and the proper usage and attribution of media files. This foundational knowledge is crucial for anyone looking to contribute effectively to Wikimedia Commons and ensure their media files are appropriately licensed and attributed.

Practical Techniques and Campaign Participation

During the session, Azogbonon demonstrated practical uploading techniques for media files, offering step-by-step instructions on how to navigate the Commons platform. He explored various Wikimedia Commons campaigns, encouraging participants to get involved in their local communities. By participating in these campaigns, contributors can help enrich Wikimedia Commons with diverse and culturally significant media.

He also provided guidance on effective ways to contribute to Commons and addressed common challenges faced by new contributors, such as understanding copyright rules and selecting appropriate licenses for their media files. These insights are invaluable for ensuring that new contributors can navigate the complexities of copyright and licensing with confidence.

Recommendations for Continued Engagement

Azogbonon emphasised the importance of continuous contribution to Wikimedia Commons. He recommended that participants:

  • Participate actively in Wikimedia Commons campaigns to increase the repository’s breadth and depth.
  • Join periodic training sessions for new contributors to gain a better understanding of Wikimedia Commons, Creative Commons licenses, and uploading techniques.
  • Actively promote Wikimedia Commons campaigns to encourage wider participation and contributions.

The webinar provided an in-depth look at the significance of Wikimedia Commons and its role in supporting various Wikimedia projects. By contributing to Commons, participants help improve the visibility of Wikimedia projects because, as the saying goes, a picture is worth a thousand words.

Conclusion

The session concluded with a Q&A segment, where participants had the opportunity to ask questions and seek further clarification on topics discussed. Azogbonon  offered words of encouragement, urging participants to engage actively in Wikimedia community campaigns and edit-a-thons. He stressed that regular practice is essential for honing their skills in contributing to Wikimedia Commons. This engaging and informative webinar not only equipped participants with the necessary tools and knowledge to contribute effectively to Wikimedia Commons but also fostered a sense of community and collaboration among Francophone Wikimedians.

For those interested in revisiting the session, or those who missed it, the recorded version is available on the community programmes page, where you can also test your knowledge on our Academy Africa. Ensure you are registered for the upcoming CfA WiR bi-weekly webinar and immerse yourself in our vibrant community. To stay abreast of our initiatives, complete this form, and let’s shape the future together!

Dark mode is here for Wikipedia (finally!).

Dark mode has been one of the most requested features. It improves accessibility and reduces eye strain for readers and communities across Wikimedia projects by providing a low-contrast environment. The feature is now available on select wikis, on mobile and desktop, in both reading and editing mode!

While creating dark mode may seem simple, on Wikipedia and other Wikimedia projects the project required overcoming tricky and unique challenges. The Foundation’s engineering teams worked closely with volunteer technical editors to bring this feature to fruition.

Why dark mode matters

Wikipedia readers and communities wanted a dark mode! It has been one of the most requested features from editors since at least 2019. The feature’s absence has been a constant source of grievance on the Wikimedia Foundation’s social media accounts. In addition, our analytics show that 20% of our readership has specified on the operating system level that they would prefer dark mode. While research is mixed, dark mode is often praised as reducing eye strain and helping people with medical conditions.

The question about introducing this feature has never been why; it has always been how. First, the Wikimedia Foundation needed to overcome underlying limitations of our software and architecture, part of which was done by building out the Vector 2022 skin. We described this in detail in our previous blog post “Dark mode is coming”. With that done, the Web team embarked on a project to improve accessibility for readers, and we finally set out to answer the question: how can we make dark mode possible?

User-generated content

Wikipedia is a unique website, built on a massive amount of user-generated content. A lot of Wikipedia’s content is built with HTML markup and relies on the assumption that the only mode available is light. By introducing dark mode, we faced the risk of making the pages that users had carefully crafted over the years no longer accessible.

When colour is important

Sometimes colour plays an important part in a Wikipedia article. We considered a shortcut to enable dark mode by darkening all colours on the page, but it had a worrisome trait. Consider this article on International Orange, and notice below how the colour swatches are subtly different in the two images.

Subtle color differences can be dangerous. Can you tell which one is presenting International Orange accurately?

In the first example, the colours are correct. In the second image, the colours have been distorted by the “dark mode shortcut” and are actually presenting incorrect information. However, without careful consideration nothing seems wrong, and there is a risk such an issue could remain undetected.
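
The risk of such a shortcut can be sketched with a toy example: naively inverting RGB channels, a simplified stand-in for an automatic "darken everything" filter, turns the aerospace shade of International orange (#FF4F00, chosen here purely for illustration) into something that is not orange at all.

```python
def invert(rgb):
    """Naively invert a colour, as a crude 'darken everything' filter might."""
    r, g, b = rgb
    return (255 - r, 255 - g, 255 - b)

# International orange, aerospace shade (#FF4F00).
original = (0xFF, 0x4F, 0x00)
print(invert(original))  # -> (0, 176, 255): a light blue, not orange
```

Real "smart inversion" filters are more sophisticated than this, but the underlying problem is the same: any transformation applied blindly to every colour will misrepresent colours whose exact value is the information.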

While in this particular situation, the stakes are low, it takes little imagination to see where colour misinformation can be problematic.

Colour distortion can be confusing, humorous, and sometimes dangerous, as demonstrated above across the Wikipedia articles for Toxicity label, Color of chemicals, Hazard symbol, and Yellow. Consider the implications of a display of inaccurate colours should someone be using Wikipedia to determine what coloured gas is filling their room.

An accessible opportunity

As we began to evaluate potential fixes for the colour issues outlined above, we used the Web Content Accessibility Guidelines on colour contrast to determine which pages had colours and colour combinations that might make reading more difficult. We discovered that many articles on Wikipedia had accessibility issues in their content even in Wikipedia’s previous light-only state. The Wrexham A.F.C. article, for example, had inaccessible red backgrounds on all its table headers, simply because the football team plays in red (this is now fixed!).
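
The contrast check behind such an audit can be sketched as follows. This is a minimal implementation of the published WCAG 2.x relative-luminance and contrast-ratio formulas, not the Foundation's actual tooling; white text on a pure red background, a rough stand-in for a red table header, lands just under the 4.5:1 ratio that level AA requires for normal text.

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB colour given as 0-255 ints."""
    def linearize(c):
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colours, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# White on pure red: roughly 4.0:1, below the 4.5:1 AA threshold for normal text.
ratio = contrast_ratio((255, 255, 255), (255, 0, 0))
print(round(ratio, 2), ratio >= 4.5)
```

Because the ratio is symmetric in foreground and background, the same check flags a light-on-dark pair and its dark-on-light mirror equally, which is what makes it usable for both modes.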

This screenshot shows how the article for Wrexham A.F.C. looked before dark mode (left) and how it looks now (right). The article has tables with red headers because the team plays in a red kit. The work on dark mode flagged that this page had accessibility issues, and it was changed to more accessible colours while keeping the association with the brand.

Colour can have semantic use in conveying knowledge: when describing the colour International orange, for example, showing the colour helps. Most of the time, however, it is a stylistic choice. Some colours also have undesired connotations that can lead to misunderstandings of information; red, for instance, can be associated with warning and danger. In general, colour should be used wisely. As part of this project, we wanted to make it easier for volunteer editors to find and fix these issues with user-generated content.

Because Wikipedia can be edited by everyone, accessibility is a collective responsibility. Adding the dark mode feature provided an opportunity to improve Wikipedia content. What better way to bring greater awareness to accessibility and existing issues, than using a widely desired feature as the vehicle?

Improving our content

Despite the challenges involved in improving articles, it was important not to take a shortcut, and instead to make our content better. After all, Wikipedia is all about a constant, collaborative process to improve the encyclopedia.

To support this work, we created tools for flagging issues and empowering editors to get involved and make our content more accessible. Because of the challenges with the existing formatting assuming light mode, we decided to be conservative with the release.

We have made dark mode available on all Wikipedias for logged-in users. For logged-out users (so, most readers), though, we are only releasing it on wikis where technical editors are fixing the colour issues, or where there are few issues to begin with. This includes English, Modern Written Chinese, French, Japanese, Dutch, Polish, Portuguese, and a number of smaller Wikipedias. We look forward to making dark mode available in more languages after we have worked further with their technical editors.

If you edit Wikipedia and can help with CSS and templates, we encourage you to do so. If your wiki does not have Vector 2022 as the default skin for desktop yet, and you would like to make dark mode available for everyone there, reach out to your community and to us. Dark mode only works on the Minerva (mobile) and Vector 2022 (desktop) skins.

Thank you!

This has been a journey with many insightful conversations! First, special thanks go to all volunteer editors updating the existing formatting. Next, thanks to Bernard, Edward, George, Jan, Jennifer, Justin, Kim, Mo, Olga, Nat, and Steph in the Web team, and to Anne, Barbara, Chris, Eric, Derek, Lauralyn, Roan, and Volker in the Design System team. They have been incredibly patient and responsive throughout this project with our rapidly-changing needs. Thanks also to the MediaWiki Engineering team, and especially Moriel, for kicking this project off with us and helping us get things off the ground. Thanks to Volker and MusikAnimal whose early work many years ago provided the impetus for much of this work. Our product ambassadors Bachounda, Isabel, Mehman, and Phuong, as well as our Movement Communications support Szymon have played an essential role by keeping us connected to our editing communities. As the Web’s tech lead, I headed the engineering part of the work.

In addition, thank you as always to Wikimedia communities that helped us along the way and the key contributions from other colleagues and friends that helped us scale across our codebases and content. The Wikimedia Hackathon 2024 in Estonia played an important role! Rather than miss out on someone, I’ve created this page to celebrate all those many people — please help us expand it!

We hope you enjoy reading Wikipedia in dark mode!

Write an article on Palestinian poet on Wikipedia

Wednesday, 17 July 2024 21:30 UTC

In the past, when I wanted to know something about Palestine, I never researched it any further. Now, however, my style is to first write an article on Wikipedia. I have to do a lot of research in order to write an article, and as a result I gain a deeper understanding of Palestine, which I can then share with many other people.

The Islamic University of Gaza where Mosab Abu Toha studied

Just as I was thinking about what to write about, I learned that an anthology of Palestinian poetry appeared in the May 2024 issue of the Japanese poetry magazine Gendaishi Techo (現代詩手帖), and I immediately bought it. There are 12 Palestinian poets in the magazine, but only Refaat Alareer, who died in an Israeli airstrike in December 2023, had an article in the Japanese Wikipedia.

So I started by translating the article on Mosab Abu Toha, who had one in the English Wikipedia. He was born in Palestine in 1992, studied English at the Islamic University of Gaza, and established a public library of English-language books in the Gaza Strip. It was the only English-language library in Gaza, an Arabic-speaking region. Abu Toha’s aspiration to establish a library as an oasis in a land where conflict continues unabated, and to teach the value of knowing English, was well received by the local people, and many adults as well as children visited the library. He later studied in the U.S., where he earned a master’s degree, and published his poems in various media. However, he is currently staying in Cairo, Egypt with his family after escaping the bombings.

The library he founded was named the Edward Said Public Library. Said (1935–2003) was a well-known Palestinian intellectual who went to the United States and lectured on English literature and comparative literature at Columbia University. He was active on Palestinian issues and in 1999 founded the West-Eastern Divan Orchestra with conductor Daniel Barenboim. It is clear that Abu Toha respected Said.

The Abu Toha article had many red links that did not exist in the Japanese version, so I decided to translate them little by little, using Wikidata as a reference. The Edward Said Public Library had no Wikipedia article in any language, but it did have a Wikidata item, so I used that as a reference. In the meantime, I found the library’s official website, which has a video showing daily life in Gaza and the library’s activities. I hope many people will watch it.

New Japanese Wikipedia articles

WikiForHumanRights in Nigeria 2024 Wikipedia Training

Wednesday, 17 July 2024 21:30 UTC

On the 5th and 6th of July 2024, the WikiForHumanRights (W4HR) in Nigeria 2024 Campaign hosted its 1st and 2nd Wikipedia online training sessions. The sessions, titled “Introduction to Wikipedia” and “Performing Major Edits on Wikipedia,” spotlighted Precious Kenechukwu, the Senior Partnership Officer of the Climate Action Organization, who spoke about the organization’s alignment with the W4HR in Nigeria 2024 Campaign and how it is pioneering transformative change at the intersection of Africa’s resilience restoration and green, sustainable socio-economic development.

The session progressed with Barakat Adegboye, the Programs Lead Intern at the Wikimedia Nigeria User Group, leading the Wikipedia training with moderation support from Agnes Abah, the local coordinator for the W4HR 2024 in the Igala Community, and Kemi Makinde, one of the National Coordinators for the W4HR in Nigeria 2024, as well as backend support from both the working team and local coordinators.

The 2-day training, which lasted for 2 hours per day, attracted over 170 participants in total from the 7 existing communities: WUGN Anambra Network, Igala Community, WUGN Osun, Port Harcourt Wikimedia Nigeria Hub, WUGN Kaduna Network, WUGN Imo Network, and WUGN Gombe Network. This online training provided an in-depth exploration of the overview of and misconceptions about Wikipedia, getting started on Wikipedia, setting up editing tools and gadgets, and hands-on demonstrations of how to perform minor and major edits on Wikipedia, with a focus on topics such as sustainable agriculture, environmental sustainability, clean water, and waste management.

Key Highlights

Barakat began the training with an introduction to Wikipedia, covering the basics of Wikipedia, its principles, and facts about it. She presented recent statistics, noting that Wikipedia has over 60 million articles in 334+ languages, with the English Wikipedia having the most articles of any edition, totaling 6,839,278 articles as of June 2024. She proceeded to introduce participants to the core content policies and the general and subject notability guidelines of Wikipedia. She also addressed misconceptions about Wikipedia, giving examples such as Wikipedia being a secondary source and the incorrect practice of adding opinions to Wikipedia articles as if they were facts, and she corrected these misconceptions.

The session continued with a practical guide on how to create a Wikipedia account and a user profile, including setting up the editing interface. Barakat further explored the anatomy of Wikipedia by demonstrating features such as the lead section, search box, wikilinks, badges, talk section, view history, article talk, and references.

She then discussed how to set up preferences and enable gadgets to make editing more enjoyable for participants. This was followed by a demonstration on how to use the source and visual editing tools and how to perform minor edits such as adding wikilinks, citations, images, and infoboxes to an existing Wikipedia article, using the article on  Climate change in Nigeria as a case study.

On the second day of the training, she continued the demonstration from where she had left off, focusing on creating and updating an infobox. She then showed participants how to build a comprehensive Wikipedia article page by paying attention to structure, including the lead section, headings, wikilinks, references, categories, pictures, and library resources necessary to create an engaging article.

Conclusion

The session concluded with a quick review of the progress made by the editors and a reminder about the score guide and the campaign’s do’s and don’ts to ensure participants’ contributions are counted and rewarded accordingly. Participants had the opportunity to ask questions and seek clarity on any aspects of the campaign that were unclear to them. They were also encouraged to reach out to their community coordinators and the trainers for one-on-one support.

For those who may have missed the session, the link to access it is available on the community training schedule. Additionally, we encourage you to register under any of the 7 organizing communities to be part of our upcoming training sessions and to ensure you receive timely email notifications one hour before each online session begins. Let’s work together to create knowledge for a sustainable future and bridge the gap in information about sustainable agriculture, environmental sustainability, clean water, and waste management.

A big thank you to all the Wikimedia Foundation, Coordinators, Working Team, Partners, and Participants!

Connect with us!

A group of Wikipedians in Ghana is working to educate the next generation about the vital relationship between the open movement and climate change. The project, dubbed Open Climate Education for High Schools, seeks to create awareness and educate students on climate change by leveraging the opportunities the open movement presents.

The project lead shares that climate change remains one of the most pressing concerns of local and global leaders, and that a collaborative approach ensuring no one is left behind must therefore be employed in addressing it. In addition to being the President of the Eco Warriors Movement and the Project Lead for the Open Climate initiative, Otuo-Akyampong Boakye is an enthusiastic Wikipedian and environmental scientist from Ghana.

The open movement – which includes open access, open data, open source, and more, with projects like Wikimedia, Kiwix, PhET, etc. – has great potential to accelerate climate action. However, this potential is often underutilized, particularly by younger generations. This project aims to unveil the enormous potential of the open movement in addressing climate change and other environmental challenges.

Ghanaian students are being introduced to the concepts of open climate through a range of instructional workshops and programs to create awareness. Discussions are raised on how open collaboration can promote creative climate solutions, how open data can enhance climate modeling, and how open source renewable energy technologies can increase access.

Open Educational Resources (OER) provide teaching, learning, and research materials that are either in the public domain or licensed in a manner that grants everyone free and perpetual permission to engage in free use, adaptation, and sharing. OER allows the “5R activities”: retaining, remixing, revising, reusing, and redistributing the resources.

“The project is to spark curiosity and engagement around these issues,” Boakye opines. “We need young people to be informed, empowered, and ready to take action on climate change. The open movement can be a powerful tool in that fight, if we can just raise awareness about it.”

With project partners like the Eco Warriors Movement, Kumasi Wiki Hub and support from Creative Commons Open Education Platform, the project has the needed resources to effectively achieve its goals and objectives.

“Climate change is the challenge of our time, and we all have a role to play in addressing it,” says Boakye. “By harnessing the power of openness, transparency, collaboration, and free use, I believe we can build a more sustainable, resilient future. That’s the future we’re fighting for, and I’m glad to have Ghanaian students on board.”

Read more about this project in this local news article.

Tunnel from old books in Municipal Library, Prague by Deror_avi – Own work, CC BY-SA 4.0

In the sprawling digital landscape of the 21st century, the Wikimedia Foundation faces an increasing demand to provide our movement and the world at large with the infrastructure to curate human-led content and a tech-enabled knowledge system that caters to everyone, regardless of the language they speak.

As the Wikimedia Foundation starts the 2024–2025 fiscal year, the Product and Tech department continues to focus on knowledge equity and knowledge as a service. This extends to supporting the creation of trustworthy encyclopedic content that encompasses all human knowledge. By working with volunteers, we would like to help them identify knowledge gaps and equip them with the tools they need to reduce these gaps and overcome barriers that prevent continuous contributions to encyclopedic knowledge across all languages. To facilitate this, the Product and Tech department has a new team: the Language and Product Localization team. This team will focus on supporting multilingualism within the movement and providing standards-based tooling for our communities to advance localised technical initiatives that bridge knowledge gaps and promote language equity, bringing us closer to our strategic goal of knowledge equity.

The Language and Product Localisation team was born from the fusion of the Language and Inuka teams, consolidating their work of providing a range of support to all language communities, including underserved and underrepresented communities, according to their needs. They will continue to innovate, experiment, and ensure our digital products and platforms are easily adapted for different languages, cultures, and regions. Forming this team is a strategic way to remove the linguistic, cultural, and other barriers in our digital world that have frustrated curious minds eager to learn but faced with a wall of unfamiliar languages and technology. Every day, thousands face this digital Tower of Babel, their thirst for knowledge hindered by the very tools meant to quench it. The Language and Product Localisation team’s work lies in this gap, the chasm between information and understanding.

Language and Product Localisation team’s work

The team will continue to evolve our tools and experiments and to provide technical support, meeting contributors and the people who use our platforms where they are and helping them get to where they want to be in the free knowledge ecosystem. In a nutshell, drawing on the new team’s robust expertise, they will:

  • Work to create and maintain tools that facilitate the use of languages on our websites using MediaWiki like the Translate extension that translators use to translate software strings and pages in their browser.
  • Provide extensive tools and features for localisation and translation used in Wikipedia and other projects. An example of this type of tool is the Universal Language Selector extension (ULS), a tool that helps people use our sites in different languages. With ULS, you can select and configure the Wikimedia platforms interface in many languages, even if your computer isn’t set up for those languages. 
  • Engage closely with communities to create features and technical support to advance their work in sharing and building knowledge. An example of providing technical support is helping communities resolve script issues in our projects by bringing them to the attention of the Unicode Consortium, now that the Wikimedia Foundation is a member of the Consortium.

They will focus on four key workstreams. The first is the former Language and Inuka team annual plan objectives, key results and hypothesis; some of the hypotheses that will be worked on are:

  • “If we build a proof-of-concept providing translation suggestions that are based on user-selected topic areas, we will be set up to successfully test whether translators will find more opportunities to translate in their areas of interest and contribute more compared to the generic suggestions currently available.”
  • By the end of Q2, support organizers, contributors, and institutions to increase the coverage of quality content in key topic areas through experiments.

You can find more information about these hypotheses and key results in this table.

The second workstream is ongoing support for essential tools and infrastructure, including continuous work on Content Translation, Machine in Translation (MinT), Wikipedia Preview, localization infrastructure, etc. The third is language technical support with community engagement, which includes responding to i18n issues, community meetups, developer support, language onboarding, etc. And lastly, futuristic explorations and experiments that can develop into bigger features over time, like the MinT project.

Why the synergy of two former teams?

The Product and Tech department consolidated the Language and Product localisation team based on the focus and need in the 2024–25 annual plan to address content and knowledge gaps. They identified an opportunity that will increase work efficiency and harness the combined skills and expertise of the former Language and Inuka teams that already have overlaps with the communities they work with; these two teams also have a shared goal of closing knowledge gaps. The department is confident that bringing them together to work more closely and focusing on tools and support systems will significantly impact this vital objective of knowledge equity.

At its core, this new team captures the Wikimedia Foundation’s work of “ensuring standards-based tooling to support multilingualism within the Wikimedia movement and advancing localised technical initiatives to reduce knowledge gaps and promote language equity.” The team is committed to breaking down language barriers and fostering a global community of knowledge creators and consumers.

As this dynamic new team sets sail on their ambitious voyage into the 2024–25 fiscal year, we invite the movement to follow their expedition and quench your curiosity with questions to the team on their plans and work. The horizon is bright with expectations as we watch them grow, innovate, and break down barriers—transforming the landscape of free knowledge and truly opening the gates of learning to every corner of our diverse global community. Stay tuned for the journey ahead!

Some time ago I celebrated a birthday in an Italian restaurant in Haifa, and I saw a pack of pasta of a curious shape on a shelf there. I asked whether they serve it or sell it.

“No”, they told me, “it’s just a display”.

This answer didn’t satisfy me.

I added the pasta’s name, Busiate, to my shopping list.

I searched for it in a bunch of stores. No luck.

I googled for it and found an Israeli importer of this pasta. But that importer only sells in bulk, in crates of at least 12 items. That’s too much.

And of course, I searched Wikipedia, too. There’s an article about Busiate in the English Wikipedia. There are also articles about this pasta in Arabic and in Japanese, but curiously, there’s no article about it in the Italian Wikipedia, nor in the Sicilian one, even though this type of pasta is Sicilian.

So I… did a few things about it.

I improved the article about Busiate in the English Wikipedia: cleaned up references, cleaned up formatting, and updated the links to references.

I also improved the references and the formatting of the article about Pesto alla trapanese, the sauce with which this pasta is traditionally served.

And I cleaned up the Wikidata items associated with the two articles above: Q48852218 (busiate) and Q3900766 (pesto alla trapanese).

And I also translated all the names of the Wikidata properties that are used on these items to Hebrew. I usually do this when I do something with any Wikidata item: I only need to translate these property names once, and after that all the people who use Wikidata in Hebrew will see them in Hebrew on every item where these properties are used. There are more than 6000 properties, and the number is constantly growing, so it’s difficult to have everything translated, but every little translation makes the experience more complete for everyone.
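Translating a property label like this goes through the wbsetlabel module of the Wikidata API; once set, the label shows up for everyone who views any item using that property. Here is a rough sketch of the request parameters. P527 (“has part”) is a real property, but the Hebrew label text is only illustrative, and a real call also needs an edit token and an authenticated POST:

```python
# Sketch: parameters for a Wikidata wbsetlabel API call that sets an
# entity's label in one language. A real call is an authenticated POST
# to https://www.wikidata.org/w/api.php with a CSRF token.

def set_label_params(entity_id: str, lang: str, label: str) -> dict:
    """Build the query parameters for one wbsetlabel edit."""
    return {
        "action": "wbsetlabel",
        "id": entity_id,   # an item or property ID, e.g. "P527"
        "language": lang,  # language code of the label being set
        "value": label,    # the translated label itself
        "format": "json",
    }

# Example: a Hebrew label for the "has part" property (label text is
# illustrative, not necessarily the label actually used on Wikidata).
params = set_label_params("P527", "he", "מורכב מ")
```

Because labels are stored once per entity, this single edit is enough for every Hebrew-language user of Wikidata to see the property in Hebrew from then on.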

I added references to the Wikidata item about the sauce. Wikidata must have references, too, and not only Wikipedia. I am not enthusiastic about adding random recipe sites that I googled up as references, but luckily, I have The Slow Food Dictionary of Italian Regional Cooking, which I bought in Italy, or more precisely in Esino Lario, where I went for the 2016 Wikimania conference.

Now, a book in Wikidata is not just a book. You need to create an item about the book, and another item about the edition of the book. And since I created those, I also created Wikidata items for the dictionary’s original Italian author Paola Gho, for the English translator John Irving, and for the publishing house, Slow Food.

And here’s where it gets really nerdy: I added each of the sauce’s ingredients as values of the “has part” property, and added the dictionary as a reference for each entry. I initially thought that it was overdone, but you know what? When we have robot cooks, as in the movie I, Robot, busiati col pesto trapanese will be one of the first things that they will know how to prepare. One of the main points of Wikidata is that it’s supposed to be easy to read for both people and machines.
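A referenced statement like this has a well-defined shape in the Wikibase JSON data model, which is what makes it machine-readable. The following sketch builds such a statement payload by hand; the property IDs P527 (“has part”) and P248 (“stated in”) are real, but the item IDs for the ingredient and the dictionary edition are placeholders, not the ones actually used:

```python
# Sketch: a Wikibase statement ("has part" -> ingredient) carrying a
# "stated in" reference, in the JSON shape Wikidata stores and the
# wbsetclaim API accepts. Item IDs below are placeholders.

def item_snak(prop: str, item_id: str) -> dict:
    """A snak whose value is another Wikidata item."""
    return {
        "snaktype": "value",
        "property": prop,
        "datavalue": {
            "type": "wikibase-entityid",
            "value": {"entity-type": "item", "id": item_id},
        },
    }

def has_part_statement(ingredient: str, source_book: str) -> dict:
    """A 'has part' statement referenced to a 'stated in' source."""
    return {
        "type": "statement",
        "rank": "normal",
        "mainsnak": item_snak("P527", ingredient),
        "references": [
            {"snaks": {"P248": [item_snak("P248", source_book)]}}
        ],
    }

# Placeholder ingredient item, sourced to a placeholder book-edition item.
stmt = has_part_statement("Q28692", "Q12345")
```

In practice a library such as pywikibot wraps this JSON behind friendlier objects, but the underlying structure is what a “robot cook” would read.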

And since I have a soft spot for regional languages, I also added the sauce’s Sicilian name under the “native label” property: pasta cull’àgghia. The aforementioned Slow Food Dictionary of Italian Regional Cooking actually does justice to the regional part in its title, and gives the names of the different food items in the various regional languages of Italy, so I could use it as a reliable source.

And I translated the Wikipedia article into Hebrew: בוזיאטה.

And I also created the “Sicilian cuisine” category in the Hebrew Wikipedia. A surprisingly large number of articles already existed, filed under “Italian cuisine”: Granita, Arancini, Cannoli, and a few others. Now they are organized under Sicilian cuisine. (I hope that some day Wikipedia categories will be managed more automatically with the help of Wikidata, so that I wouldn’t have to create them by hand.)

Finally, I found the particular issue of the Gazzetta Ufficiale of the Italian Republic, in which busiati col pesto trapanese was declared as a traditional agricultural food product, and I added that issue as a reference to the Wikidata item, as well.

And all of this yak shaving happened before I even tasted the damn thing!

So anyway, I couldn’t find this pasta anywhere, and I couldn’t buy it from the importer’s website, but I wanted it really badly, so I called the importer on the phone.

They told me they don’t have any stores in Jerusalem that buy from them, but they suggested checking a butcher shop in Mevaseret Tsiyon, a suburb of Jerusalem. Pasta in a butcher shop… OK.

So I took a bus to Mevaseret, and voilà: I found it there!

And I made Busiate, and I made the sauce! It’s delicious and totally worth the effort.

Of course, I could just eat it without editing Wikipedia and Wikidata on the way, but to me that would be boring.

My wife and my son loved it.

These are the busiate with pesto alla trapanese that I made at home. I uploaded this photo to Wikimedia Commons and added it to the English Wikipedia article as an illustration of how Busiate are prepared. I wonder what Wikipedians from Sicily think of it.

There is a story behind every Wikipedia article, Wikidata item, and Commons image. Millions and millions of stories. I wrote mine—you should write yours!

It sometimes happens in people’s lives that someone tells them something that sounds true and obvious at the time. It turns out that it actually is objectively true, and also obvious, or at least sensible, to the person who hears it, but not obvious to other people. Because it was obvious to them, they assume it is obvious to everyone else, even though it isn’t.

It happens to everyone, and we are probably all bad at consistently noticing it, remembering it, and reflecting on it.

This post is an attempt to reflect on one such occurrence in my life; there were many others.

(Comment: This whole post is just my opinion. It doesn’t represent anyone else. In particular, it doesn’t represent other translatewiki.net administrators, MediaWiki developers or localizers, Wikipedia editors, or the Wikimedia Foundation.)


There’s the translatewiki.net website, where the user interface of MediaWiki, the software that powers Wikipedia, as well as of some other Free Software projects, is translated into many languages. This kind of translation is also called “localization”. I mentioned it several times on this blog, most importantly at Amir Aharoni’s Quasi-Pro Tips for Translating the Software That Powers Wikipedia, 2020 Edition.

Siebrand Mazeland used to be the community manager for that website. Now he’s less active there, and, although it’s a bit weird to say it, and it’s not really official, these days I kind of act like one of its community managers.

In 2010 or so, Siebrand heard something about a bug in the support of Wikipedia for a certain language. I don’t remember which language it was or what the bug was. Maybe I myself reported something in the display of Hebrew user interface strings, or maybe it was somebody else complaining about something in another language. But I do remember what happened next. Siebrand examined the bug and, with his typical candor, said: “The fix is to complete the localization”.

What he meant is that one of the causes of that bug, and perhaps the only cause, was that the volunteers who were translating the user interface into that language didn’t translate all the strings for that feature (strings are also known as “messages” in MediaWiki developers’ and localizers’ jargon). So instead of rushing to complain about a bug, they should have completed the localization first.

To generalize it, the functionality of all software depends, among many other things, on the completeness of user interface strings. They are essentially a part of the algorithm. They are more presentation than logic, but the end user doesn’t care about those minor distinctions—the end user wants to get their job done.

Those strings are usually written in one language—often English, but occasionally Japanese, Russian, French, or another one. In some software products, they may be translated into other languages. If the translation is incomplete, then the product may work incorrectly in some ways. On the simplest level, users who want to use that product in one language will see the user interface strings in another language that they possibly can’t read. However, it may go beyond that: writing systems for some languages require special fonts, applying which to letters from another writing system may cause weird appearance; strings that are supposed to be shown from left to right will be shown from right to left or vice versa; text size that is good for one language can be wrong for another; and so forth.
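The simplest of these failure modes, untranslated strings showing up in another language, falls out of the fallback chain that most localization systems, MediaWiki included, use when a message is missing. A rough sketch of the idea, with an invented two-message catalog:

```python
# Sketch of a localization fallback chain: if a message isn't translated
# into the requested language, fall back along a per-language chain,
# ultimately to English. The catalog and message keys are invented.

MESSAGES = {
    "en": {"search": "Search", "welcome": "Welcome, $1!"},
    "he": {"search": "חיפוש"},  # "welcome" not yet translated
}

FALLBACKS = {"he": ["en"]}  # each language's fallback chain

def msg(key: str, lang: str) -> str:
    """Look a message up, walking the fallback chain for the language."""
    for code in [lang, *FALLBACKS.get(lang, []), "en"]:
        if key in MESSAGES.get(code, {}):
            return MESSAGES[code][key]
    return f"<{key}>"  # marker for a message missing everywhere

# The translated message comes back in Hebrew...
hebrew_search = msg("search", "he")
# ...but the untranslated one silently falls back to English, which is
# exactly the class of bug that completing the localization fixes.
english_fallback = msg("welcome", "he")
```

Nothing crashes, so the gap is easy to miss; the page simply renders with mixed languages until someone completes the translation.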

In many cases, simply completing the translation may quietly fix all those bugs. Now, there are reasons why the translation is incomplete: it may be hard to find people who know both English and this language well; the potential translator is a volunteer who is busy with other stuff; the language lacks the necessary technical terminology to make the translations, and while this is not a blocker (new terms can be coined along the way), it may slow things down; a potential translator has good will and wants to volunteer their time, but hasn’t had a chance to use the product and doesn’t understand the messages’ context well enough to make a translation; etc. But in theory, if there is a volunteer who has relevant knowledge and time, then completing the translation, by itself, fixes a lot of bugs.

Of course, it may also happen that the software actually has other bugs that completing the localization won’t fix, but that’s not the kind of bugs I’m talking about in this post. Or, going even further, software developers can go the extra mile and try to make their product work well even if the localization is incomplete. While this is usually commendable, it’s still better for the localizers to complete the localization. After all, it should be done anyway.

That’s one of the main things that motivate me to maintain the localization of MediaWiki and its extensions into Hebrew at 100%. From the perspective of the end users who speak Hebrew, they get a complete user experience in their language. And from my perspective, if there’s a bug in how something works in Wikipedia in Hebrew, then at least I can be sure that the reason for it is not that the translation is incomplete.


As one of the administrators of translatewiki, I try my best to make complete localization in all languages not just possible, but easy.¹ It directly flows out of Wikimedia’s famous vision statement:

Imagine a world in which every single human being can freely share in the sum of all knowledge. That’s our commitment.

I love this vision, and I take the words “Every single human being” and “all knowledge” seriously; they implicitly mean “all languages”, not just for the content, but also for the user interface of the software that people use to read and write this content.

If you speak Hindi, for example, and you need to search for something in the Hindi Wikipedia, but the search form works only in English, and you don’t know English, finding what you need will be somewhere between hard and impossible, even if the content is actually written in Hindi somewhere. (Comment #1: If you think that everyone who knows Hindi and uses computers also knows English, you are wrong. Comment #2: Hindi is just one example; the same applies to all languages.)

Granted, it’s not always actually easy to complete the localization. A few paragraphs above, I gave several general examples of why it can be hard in practice. In the particular case of translatewiki.net, there are several additional, specific reasons. For example, translatewiki.net was never properly adapted to mobile screens, and it’s increasingly a big problem. There are other examples, and all of them are, in essence, bugs. I can’t promise to fix them tomorrow, but I acknowledge them, and I hope that some day we’ll find the resources to fix them.


Many years have passed since I heard Siebrand Mazeland saying that the fix is to complete the localization. Soon after I heard it, I started dedicating at least a few minutes every day to living by that principle, but only today I bothered to reflect on it and write this post. The reason I did it today is surprising: I tried to do something about my American health insurance (just a check-up, I’m well, thanks). I logged in to my dental insurance company’s website, and… OMFG:

What you can see here is that some things are in Hebrew, and some aren’t. If you don’t understand the Hebrew parts, that’s OK, because you aren’t supposed to: they are for Hebrew speakers. But you should note that some parts are in English, and they are all supposed to be in Hebrew.

For example, you can see that the exclamation point is at the wrong end of “Welcome, Amir!”. The comma is placed unusually, too. That’s because they oriented the direction of the page from right to left for Hebrew, but didn’t translate the word “Welcome” in the user interface.² If they had translated it, the bug wouldn’t be there: it would correctly appear as “ברוך בואך, Amir!”, and no fixes in the code would be necessary.
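The misplaced exclamation point is a classic bidirectional-text problem: the page’s base direction is right-to-left, but the untranslated English string carries no direction of its own, so the Unicode bidi algorithm shuffles its trailing punctuation to the visually wrong edge. Besides translating the string, a standard mitigation is to wrap embedded opposite-direction text in Unicode directional isolates; a small sketch, with an illustrative string:

```python
# Sketch: wrapping an opposite-direction fragment in Unicode directional
# isolates (FSI ... PDI) so the surrounding RTL page doesn't reshuffle
# its punctuation. The greeting string is illustrative.

FSI = "\u2068"  # FIRST STRONG ISOLATE: direction inferred from content
PDI = "\u2069"  # POP DIRECTIONAL ISOLATE: closes the isolated run

def isolate(fragment: str) -> str:
    """Isolate a fragment's text direction from the surrounding text."""
    return f"{FSI}{fragment}{PDI}"

# An untranslated English greeting embedded in an RTL page: isolating it
# makes it render left-to-right as a unit, keeping the exclamation point
# at its logical (and visual) end.
greeting = isolate("Welcome, Amir!")
```

The HTML equivalent is wrapping the fragment in an element with `dir="auto"`, which browsers treat the same way.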

You can also see a wrong exclamation point at the end of “Thanks for being a Guardian member!”.

There are also less obvious bugs here. You can see that in the word “WIKIMEDIA” under the “Group ID” dropdown, the letter “W” is only partly visible. That’s also a typical RTL bug: the menu may be too narrow for a long string, so the string can be visually truncated, but the truncation should happen at the end of the string and not at the beginning. Because the software here thinks that the end is on the left, the beginning gets truncated instead. This is not exactly an issue that can be fixed just by completing the localization, but if the localization were complete, it would be easier to notice.

There are other issues that you don’t notice if you don’t know Hebrew. For example, there’s a button with a weird label at the top right. Most Hebrew speakers will understand that label as “a famous website”, which is probably not what it is supposed to say. It’s more likely that it’s supposed to say “published web page”, and the translator made a mistake. Completing the translation correctly would fix this mistake: a thorough translator would review their work, check all the usages of the relevant words, and likely come up with a correct translation. (And maybe the translation is not even made by a human but by machine translation software, in which case it’s the product manager’s mistake. Software should never, ever be released with user interface strings that were machine-translated and not checked by a human.)

Judging by the logo at the top, the dental insurance company used an off-the-shelf IBM product for managing clients’ info. If I ask IBM or the insurance company nicely, will they let me complete the localization of this product, fixing the existing translation mistakes, and filing the rest of the bugs in their bug tracking software, all without asking for anything in return? Maybe I’ll actually try to do it, but I strongly suspect that they will reject this proposal and think that I’m very weird. In case you wonder, I actually tried doing it with some companies, and that’s what happened most of the time.

And this attitude is a bug. It’s not a bug in code, but it is very much a problem in product management and attitude toward business.


If you want to tell me “Amir, why don’t you just switch to English and save yourself the hassle”, then I have two answers for you.

The first answer is described in detail in a blog post I wrote many years ago: The Software Localization Paradox. Briefly: Sure, I can save myself the hassle, but if I don’t notice it and speak about it, then who will?

The second answer is basically the same, but with more pathos. It’s a quote from Avot 1:14, one of the most famous and cited pieces of Jewish literature outside the Bible: If I am not for myself, who is for me? But if I am for my own self, what am I? And if not now, when? I’m sure that many cultures have proverbs that express similar ideas, but this particular proverb is ours.


And if you want to tell me, “Amir, what is wrong with you? Why does it even cross your mind to want to help not one, but two ultramegarich companies for free?”, then you are quite right, idealistically. But pragmatically, it’s more complicated.

Wikimedia understands the importance of localization and lets volunteers translate everything. So do many other Free Software projects. But experience and observation taught me that for-profit corporations don’t prioritize good support for languages unless regulation forces them to do it or they have exceptionally strong reasons to think that it will be good for their income or marketing.

It did happen a few times that corporations that develop non-Free software let volunteers localize it: Facebook, WhatsApp, and Waze are somewhat famous examples; Twitter used to do it (but stopped long ago); and Microsoft occasionally lets people do such things. Also, Quora reached out to me to review the localization before they launched in Hebrew and even incorporated some of my suggestions.³

Usually, however, corporations don’t want to do this at all, and when they do it, they often don’t do it very well. But people who don’t know English want—and often need!—to use their products. And I never get tired of reminding everyone that most people don’t know English.

So for the sake of most of humanity, someone has to make all software, including the non-Free products, better localized and localizable. Of course, it’s not feasible or sustainable for me alone to do it as a volunteer, even for one language. I barely have time to do it for one language in one product (MediaWiki). But that’s why I am thinking of it: I would be not so much helping a rich corporation here as helping people who don’t know English.

Something has to change in the software development world. It would, of course, be nice if all software became Freely licensed, but if that doesn’t happen, it would be nice if non-Free software were more open to accepting localization from volunteers. I don’t know how this change will happen, but it is necessary.


If you bothered to read until here, thank you. I wanted to finish with two things:

  1. To thank Siebrand Mazeland again for doing so much to lay the foundations of the MediaWiki localization and the translatewiki community, and for saying that the fix is to complete the localization. It may have been an off-hand remark at the time, but it turned out that there was much to elaborate on.
  2. To ask you, the reader: If you know any language other than English, please use all apps, websites, and devices in this language as much as you can, bother to report bugs in its localization to that language, and invest some time and effort into volunteering to complete the localization of this software to your language. Localizing the software that runs Wikipedia would be great. Localizing OpenStreetMap is a good idea, too, and it’s done on the same website. Other projects that are good for humanity and that accept volunteer localization are Mozilla, Signal, WordPress, and BeMyEyes. There are many others.⁴ It’s one of the best things that you can do for the people who speak your language and for humanity in general.

¹ And here’s another acknowledgement and reflection: This sentence is based on the first chapter of one of the most classic books about software development in general and about Free Software in particular: Programming Perl by Larry Wall (with Randal L. Schwartz, Tom Christiansen, and Jon Orwant): “Computer languages differ not so much in what they make possible, but in what they make easy”. The same is true for software localization platforms. The sentence about the end user wanting to get their job done is inspired by that book, too.

² I don’t expect them to have my name translated. While it’s quite desirable, it’s understandably difficult, and there are almost no software products that can store people’s names in multiple languages. Facebook kind of tries, but does not totally succeed. Maybe it will work well some day.

³ Unfortunately, as far as I can tell, Quora abandoned the development of the version in Hebrew and in all other non-English languages in 2022, and in 2023, they abandoned the English version, too.

⁴ But please think twice before volunteering to localize blockchain or AI projects. I heard several times about volunteers who invested their time into such things, and I was sad that they wasted their volunteering time on this pointlessness. Almost all blockchain projects are pointless. With AI projects, it’s much more complicated: some of them are actually useful, but many are not. So I’m not saying “don’t do it”, but I am saying “think twice”.

On June 28, 2024, the WikiForHumanRights (W4HR) in Nigeria 2024 Campaign hosted its virtual launch, attracting 100 participants from 21 states in Nigeria. The event, led by 2 National coordinators, 7 community coordinators, and 6 working team members as part of the W4HR 2024 international campaign, celebrated the 75th anniversary of the Universal Declaration of Human Rights. The meeting lasted for 2 hours and aimed to encourage various Wikimedia communities to create knowledge that documents: “How Human Rights Knowledge Creates a Sustainable Future.”

The W4HR in Nigeria 2024 campaign focuses on raising awareness and understanding of human rights issues related to sustainability in Nigeria. It addresses topics such as sustainable agriculture, environmental sustainability, clean water, and waste management. The campaign identifies significant knowledge gaps on Wikipedia, Wikidata, Wikimedia Commons, and Wikivoyage regarding these issues in Nigeria, particularly in local languages like Igala, Yoruba, Igbo, Tyap, and Hausa, which hampers community awareness and engagement. The campaign aims to achieve a target of 1,230 articles, items, and media files through content expansion, translation, and creation. To reach this goal, 148 new and existing editors were recruited across 7 communities, and expectations were exceeded before the virtual launch with 228 registered participants. These participants include students and professionals from different communities who are passionate about enhancing the quality and accessibility of information, aligning their efforts with the Sustainable Development Goals (SDGs) and international human rights standards.

Key Highlights

The virtual launch, moderated by Miracle James, commenced with an inspiring talk from Euphemia Uwandu, Program Officer for Campaign Programs at the Wikimedia Foundation. Euphemia, who also serves as the Coordinator and general overseer of the WikiForHumanRights Campaigns, set the tone by providing a brief introduction to the campaign, highlighting its aims, objectives, and goals, and encouraging participants to contribute. The session proceeded to showcase one of our partners as part of our Partners Spotlight activities, aiming to educate participants on how they can contribute to the campaign by learning from Civil Society Organizations (CSOs) and Non-Governmental Organizations (NGOs) working on similar themes. The spotlight was on Plogging Nigeria, a non-profit organization that promotes environmental sustainability through organized activities called ‘Plogging Episodes’ among others. Mayokun Iyaomolere, the Founder of Plogging Nigeria, spoke about their initiatives aligning with the W4HR in Nigeria Campaign, with a specific focus on waste management and environmental sustainability.

The session then advanced with an overview from Kemi Makinde, one of the National Coordinators for the W4HR in Nigeria 2024, who enlightened participants on the theme and relevance of the campaign, as well as the timeline of implementation. This was followed by an introduction to the experienced trainers and reviewers who will be leading the general training, as documented on the campaign’s homepage. The trainers and reviewers, carefully selected based on their previous experience in leading similar trainings and coordinating campaigns, included:

  1. Rhoda James – is the Wikidata Trainer for the W4HR in Nigeria 2024 and the creative director at the Wikimedia User Group Nigeria.
  2. Iwuala Lucy – is the Wikivoyage and Wikimedia Commons Trainer and Reviewer for the W4HR in Nigeria 2024 and a language professional with a degree in Language Studies. She is an advocate for indigenous language revitalization and free knowledge dissemination. She is also a member of the Regional Grant Committee and the Charter Electoral Committee, and Social Media Manager/Member of the Wiki Loves Monuments International Team.
  3. Barakat Adegboye – is the W4HR in Nigeria 2024 Wikipedia Trainer and a Wikimedia volunteer from Nigeria. She also serves as the Programs Lead (Intern) at the Wikimedia User Group Nigeria.
  4. Muib Shefiu – is an experienced Wikimedia editor who has created hundreds of articles on the English Wikipedia. He has organised and facilitated different Wikimedia programmes. He is the founder of Afrodemics and the winner of the 2023 Wikimedia User Group Nigeria Editor of the Year award. Currently, he is a new page reviewer on the English Wikipedia and also a reviewer for this campaign.
  5. Blessing Linason – is the Wikidata Reviewer for the W4HR in Nigeria 2024 and the co-founder of the Kwara State University Wikimedia Fan Club.

The session further proceeded with an introduction to the local coordinators who shed more light on their individual community activities and expectations from their members as champions contributing to this campaign. The communities included:

  1. WUGN Anambra Network led by Dr. Ngozi Osuchukwu
  2. Igala Community led by Agnes Abah
  3. WUGN Osun led by Adetoro Praise
  4. Port Harcourt Wikimedia Nigeria Hub led by Jeremiah Ugwulebo
  5. WUGN Kaduna Network led by Ramatu A Haliru
  6. WUGN Imo Network led by Emmanuel Obiajulu
  7. WUGN Gombe Network led by Ismael Atiba

Upon completion of the introductions from both local coordinators and the working team, as one of the National Coordinators for this campaign, I (Bukola James) took the lead in walking participants through the campaign home page, resources, topic list, and reports to support their contributions. This was followed by a practical explanation by the reviewers of the score guide and the criteria for rewards at the end of the project.

Conclusion

The virtual launch concluded with an interactive Q&A session where participants had the opportunity to express their concerns and ask questions for clarity. At the end of the Q&A, participants also got to interact and network among themselves.

For those who may have missed the session, the link to access it is available on the community training schedule. Additionally, we encourage you to register under any of the 7 organizing communities to be part of our upcoming training sessions and to ensure you receive timely email notifications one hour before each online session begins. Let’s work together to create knowledge for a sustainable future and bridge the gap in information about sustainable agriculture, environmental sustainability, clean water, and waste management.

A big thank you to all the Wikimedia Foundation, Coordinators, Working Team, Partners, and Participants!

As part of the Wiki Women in Red @8 campaign in 2023, we collaborated with various organizations, such as the University of Professional Studies Office of the Women’s Commissioner and the former Women’s Commissioner for the University of Ghana, to conduct a series of workshops and trainings, empowering over 60 new editors through various in-person and online workshops. There was also an online contest for both existing and new editors to participate in. These workshops educated and empowered women, especially with Wikipedia and Wikimedia Commons skills. Part one of this article provided an overview of the campaign and the various activities which took place. In part two (this post) we highlight some of the challenges, how we navigated our way through the campaign, and what participants had to say.

Challenges

IP block

One of the main challenges encountered throughout this training was the IP block on English Wikipedia. Although many of the participants were eager to edit Wikipedia, this was a major obstacle. To counter it, we asked participants to create their Wikipedia accounts through Wikimedia Commons. We also tried to create Wikipedia accounts for others using the dashboard, and later on requested participants to email Graham with their username and IP. Even after all these steps they still couldn’t edit on English Wikipedia. This experience was very discouraging for the participants, as they were eager to make improvements to wiki pages and to practice. As a result we had to focus more on creating items on Wikidata and uploading images to Wikimedia Commons, which was fulfilling. At the training, most participants did not have images of women, so they resorted to uploading some existing images not related to the topic for their hands-on practice. We noticed that this experience is usually encountered by newcomers. We hope something can be done about this for future events.

Lack of sources for articles about women

Sources can be a challenge when writing about women. For example, there are a lot of women’s football teams in Ghana, but one of the challenges was that they lacked the sources/references needed to create articles. Even for the teams with a few publications, not much was documented about them. In future we will also put some effort into translating existing articles.

Notability on Wikidata

Wikidata was one of the projects the campaign focused on. Some Wikidata items created by the volunteers were deleted under notability criteria, even though some of the team members felt the subjects were notable. We believe that notability is not universal and can vary from community to community. Nevertheless, we ensured that guidelines were adhered to, to avoid further deletions.

Excess participants

Another challenge was that the campaign attracted a lot of new participants: over 100 joined our WhatsApp group, eager to learn new skills on Wikipedia. As a result we were unable to extend one-on-one support or host all members during the in-person training, although they were eager to participate. The training venue was a further constraint: it did not allow us to accommodate participants coming from afar. However, we held a general online training session, which some benefitted from.

Challenge using Dashboard

Using the dashboard was great; however, it took a lot of work digging into the data to discover what was relevant for our campaign. In doing that exercise, we realized that many articles edited within the period of the campaign, but not as part of it, were captured on the dashboard. We found this by checking the article histories. That exercise, although cumbersome, helped us to identify the actual articles and contributions made by participants of the campaign.

Event registration tool

Another challenge we encountered was creating the event registration page. The first issue was the delay in granting our request: we had to apply for organizer rights before we could actually create an event registration page. Because of this delay we used a Google Form to start recruiting and then switched to the event registration tool when it was ready, leading to duplicate registrations.

Secondly, we had already set up our main campaign page with registration ongoing, and we learned that the registration tool could not be built on an existing campaign page on Meta; it should have been created at the outset, when the Meta page was first set up. This led us to create a separate registration page. In future we will take note of this, as having the event tool on the main campaign page minimises duplication.

Another thing we learned was that because we had already created a dashboard with editors signed up, we could not link that dashboard to the new event registration page, so we kept the existing dashboard.

We were also recruiting for multiple events at the same time. This led us to use a Google Form alongside the tool, since we did not want to create yet another event registration page.

We also could not add our own questions to the event registration form for evaluation purposes.

In spite of the challenges, we found the registration tool very easy to use. It had features that gave us information about participants, and we were able to send messages to participants through the event registration page to keep them posted. This tool is a game changer for Wikimedia organizers, and with continued improvement it will serve its ultimate purpose in the community.

Recommendations

Although the on-wiki registration tool is great, some improvements are still needed to make it easier for organizers to use. Some recommendations from our team are:

1. All organizers should be given access rights to create their own event registration pages, or requests to create registration pages should be granted speedily.

2. We recommend a flexible way for an event page to be adjusted and refreshed. What happens if the same page is maintained for another event the following year? Would we keep creating separate event pages, or can the existing registration be amended to make way for the new campaign?

3. More education is needed around the limitations of the tool to help organizers plan ahead, e.g. you cannot create a registration page on an existing Meta page, registration page creation works together with dashboard creation, and you cannot embed an existing dashboard in the registration page.

4. In future we hope the registration tool can adopt some features from Google Forms, such as downloadable infographics summarising the characteristics of registered participants and the ability to download the data.

5. The event registration questions should be made flexible so organizers can add questions to meet their needs, e.g. we were not even able to ask where participants came from, or other questions we wanted answered about our target population.

6. There should be an on-wiki messaging option so that participants whose email addresses are not linked to their accounts can still receive update notifications on their Wikipedia accounts. At the moment, mass messaging works only via email.

Opportunities

This campaign opened up many opportunities for us as an organization, including a partnership with the University of Professional Studies, which is looking forward to more such training for its students, and with the University of Ghana.

After learning about our Wikipedia training program, we were contacted by Dr. Yaw at the Centre for Climate Change and Sustainability Studies at the University of Ghana about a partnership to incorporate Wikipedia education into their semester learning activities for Master's and undergraduate students. During our discussions we identified possible collaborations, including starting a wiki club for the Sustainability and Climate Change Association. We see this as a long-term goal: building their capacity on Wikipedia, mapping content gap areas, and bridging knowledge gaps around sustainability and gender, which is also part of their work. We foresee the possibility of engaging with them as part of the WikiForHumanRights campaign.

To sustain our community's interest and offer continued support, we constantly share online events, e.g. from the Let's Connect Telegram group, as well as other events they can benefit from. Continued programs and engagement are needed to keep this going.

Resources and Support

One of the ways we resolved IP block issues was by guiding participants to email an admin requesting an IP unblock. While some felt it was a long process, others made the request and were unblocked days after the training, which allowed them to improve existing Wikipedia articles.

We also held office hours to help them with IP unblocks, and made a video recording on how to get an IP unblock, which we hosted on our YouTube channel.

We also created a special Wikipedia community WhatsApp group where we enrolled everyone interested in the campaign to learn about Wikipedia. The campaign attracted many people, and the group currently has 190 members whom we will engage from time to time to build their capacity on Wikipedia.

As part of the resources, we also created an article list for participants to start with, almost all of which have now been created. We also shared the Wiki Women in Red article list.

We received tremendous support from two staff on the Wikimedia Foundation event registration team, Euphemia and Ilana, who trained us on how to use the event registration tool. They also helped us activate on-wiki bulk messaging to send invitations to potential candidates for the campaign. Although that had its own limitations, it was exciting to use it to invite 20 experienced editors to join the contest, though we did not get much response from that outreach.

We also had massive support from Wiki Women in Red, who liked and reshared our campaign on Twitter, boosting its visibility. This was really encouraging. We also exchanged some beautiful swag with Wiki Women in Red in Scotland as a kind gesture, and some of the swag they sent us was given to contributors to the campaign.

Experienced editors: Having experienced editors provide guidance and support was very encouraging and contributed to the success of the program. With their wealth of experience they provided support wherever they could. They assisted me with mentoring selected participants who demonstrated a willingness to learn. They also trained my team to perform tasks such as editing the Meta page, creating the dashboard, and leading some of the training sessions, among others. This has really increased our team's editing skills, and some members have gone on to join other campaigns, contributing translations. Some of the newbies have joined our Wikipedia team to support us during future trainings and campaigns.

Special thanks to the wonderful team and resource persons who supported the trainings: Jesse Aseidu Akrofi, Ruby D-Brown, Queen Murjanatu, Garbrialla, Anita Ofori, Kojo Owusu, Phillip.

Results

Although the theme was documenting articles about women in sport, we realised that participants did not want to document only women in sport. When we asked what topics they were interested in contributing to, Women in Tech and Women's Empowerment stood out in addition to women in sport. We went on to create Wikidata items and Wikipedia articles about women's sport organizations and teams, organizations focused on women, women in sustainability education, and more.

We were cognisant of the fact that the dashboard sometimes tracks other sources of contributions. Although some newcomers wrote about subjects unrelated to women, for the purposes of measuring impact we tracked women-related articles on the dashboard, and here are the statistics we gathered.

Articles related to women that we tracked from the dashboard:

Wikipedia articles improved and newly created: 658

New Wikidata items: 600+

Wikimedia Commons images: 972

Testimonials

The testimonials participants gave after the event were encouraging; most were excited and enthusiastic about the knowledge they had gained. We have also shared video testimonials.

“Impressive”

“Educational”

“Wiki women in red adding our voice as women”

“The best, the fun, the zeal”

“It’s absolutely great to gain editing skills on Wikipedia.”

“The best people to lift women are women so let’s work together and support each other to get to the top”

“Much grateful for this initiative, i have learnt a lot within a short period of them and this has opened my minds to a lot of things on wikipedia”

“Wiki Women in Red workshop is quite an essential training workshop for women. It does not only promote women’s image and brand on the internet but equips others with very important digital skills in today’s world.”

“This is a good initiative. We need more of these. African Women deserve a spotlight too. Thank you so much for doing this!”

“I just love volunteering to anything about Wikimedia Foundation”

“That it’s easy and all it takes it dedication”

How has the Wikipedia Training changed your perspective about Wikipedia?

“I appreciate how far authentic recognition goes and the verification process of information”

“Anyone can edit on Wikipedia but one need to follow the five rules”

“It has helped me a lot as a blogger and a digital marketer, i have learned a new skill through this program, a skill have been wishing to learn for long.”

“This training has helped me realize that we can put more notable people out there because it’s people like me that put the information on wikipedia”

“Anyone can edit wikipedia”

“They are into women sustainability group that helps young ladies”

“I thought it was difficult and complicated to work edit on Wikipedia or I thought Wikipedia information was from an advanced source but now I know that it is simple”

“The training had thought me about the gender bias on Wikipedia and the need to level up”

“That Wikipedia can be a reliable source of information”

“Women need to be represented and works edited on Wikipedia is for everyone. Anyone can correct whatever information you put out there but credit will still be given to you”

“Wikipedia is free to edit and is for everyone”

Data

Gender

  • 75% – Women
  • 25% – Men

Age Group

  • 63.5% – 20-25
  • 12% – 25-30
  • 18.8% – 35+

Is this your first time hearing about Wiki Women in Red?

  • 50% – Yes
  • 50% – No

Is this your first time participating in a Wikipedia campaign/Training?

  • 68.8% – Yes
  • 31.3% – No

If YES, were you able to create your Wikipedia account?

  • 93.8% – Yes
  • 6.3% – No

Gallery

IA Upload upgraded

Tuesday, 16 July 2024 09:49 UTC

Fremantle

· IA Upload · PHP · upgrades · Wikimedia ·

I shifted IA Upload on to a new server today, where it's running on Debian 12 and PHP 8.2. So that means it's time to upgrade the tool's PHP dependencies, and as it's a Slimapp app, it seems that the first step is to get simplei18n working with a more modern version of Twig. So it's not going to get done today, it seems…

Wikimedia Côte d’Ivoire’s plans for 2027

Tuesday, 16 July 2024 09:48 UTC

“By 2030, Wikimedia will become the essential infrastructure of the free knowledge ecosystem, and all those who share our vision will be able to join us.” As a stakeholder in the development of the Wikimedia movement’s 2030 strategy, the Côte d’Ivoire User Group adheres to the ideals of equity and service advocated by the aforementioned strategic direction. It has been deploying a strategic action plan at the local level since January 2024.

The Ivorian Wikimedia community met in 2021 and 2022 to explore and discuss local needs, while integrating the recommendations of the international Wikimedia movement for the 2030 horizon. This participatory approach has laid the foundations for an ambitious strategy, articulated around five key priorities, designed to stimulate engagement, collaboration and impact in Côte d’Ivoire, West Africa and the French-speaking world.

This strategic proposal was submitted and adopted at the General Assembly in April 2023. This decision marks the start of a new era for the organization, which is firmly committed to shaping the future of free and open knowledge in Côte d’Ivoire.

“Between now and 2027, Wikimedia Côte d’Ivoire will have to invest in all Wikimedia platforms by developing more local content, enhancing the involvement of volunteers, opening up to new partnerships and developing strategic tools, while respecting the standards of the Wikimedia movement,” we can read on page … of the strategic action plan.

At the heart of this plan lies an unwavering commitment to enriching Wikimedia platforms with local content, aimed at preserving and promoting the richness of Ivorian and African culture on a global scale. At the same time, Wikimedia Côte d’Ivoire will strive to forge strategic partnerships with key players, whether traditional GLAMs (Galleries, Libraries, Archives and Museums), the education and training sector, the media, public institutions, civil society organizations or even private companies, in order to strengthen access to knowledge and amplify its impact across the country.

At the same time, the organization will focus on institutional and community development, as well as investment in the human and material resources needed to sustain its activities. Finally, particular attention will be paid to mobilizing diversified financial resources, crucial to guaranteeing the long-term viability of its initiatives.

The strategic axes defined by Wikimedia Côte d’Ivoire reflect its deep commitment to the promotion of free knowledge and equity in access to information. By following these axes, the organization actively contributes to the realization of the Wikimedia Foundation’s global vision, shaping a future where knowledge is accessible to all, without barriers or borders. To find out more about Wikimedia Côte d’Ivoire’s strategic direction and impact, please visit https://shorturl.at/efJKY.

Redesigned Wikimedia wishlist is open

Tuesday, 16 July 2024 04:40 UTC

Fremantle

· Wikimedia · Community Tech · work ·

The new system for the Community Wishlist launched yesterday. It replaces the old annual cycle, in which people could propose wishes during a set period each year, followed by some weeks of voting and so on. In the new system, wishes can be submitted at any time and are gathered into focus areas, which are what will be voted on (again, at any time).

I think it's an improvement. The software for running it certainly is! We've built a data entry form, which reads and writes a wikitext table. There are also other parts that read all the wish templates into a (Toolforge) database and then write out various tables (all wishes, recent ones, etc.) into wiki pages.
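The read/write round trip can be sketched in a few lines of Python. This is a minimal illustration of the idea, not the actual Community Tech code, and it assumes the simplest {| … |} pipe syntax (no header "!" rows, cell attributes, or nested tables):

```python
# Minimal sketch of reading and writing a wikitext table, assuming the
# simplest {| ... |} pipe syntax. Illustration only, not the real code.

def parse_wikitext_table(text):
    """Parse a basic wikitext table into a list of rows of cell strings."""
    rows, current = [], None
    for line in text.strip().splitlines():
        line = line.strip()
        if line.startswith("{|") or line.startswith("|}"):
            continue  # table open/close markers carry no cell data here
        elif line.startswith("|-"):  # row separator starts a new row
            if current:
                rows.append(current)
            current = []
        elif line.startswith("|") and current is not None:
            # several cells may share one line, separated by "||"
            current.extend(cell.strip() for cell in line[1:].split("||"))
    if current:
        rows.append(current)
    return rows

def render_wikitext_table(rows):
    """Serialise rows back into wikitext table markup."""
    lines = ["{|"]
    for row in rows:
        lines.append("|-")
        lines.extend("| " + cell for cell in row)
    lines.append("|}")
    return "\n".join(lines)

table = """{|
|-
| Wish A || 12
|-
| Wish B || 7
|}"""
rows = parse_wikitext_table(table)
print(rows)  # [['Wish A', '12'], ['Wish B', '7']]
```

Parsing to plain rows and serialising back means the form never has to manipulate raw markup directly, which is the appeal of this approach.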

There's more info about the launch in a Diff post: Share your product needs with the Community Wishlist

In Janice Radway’s classic Reading the Romance of 1984, she referred to the romance-purchasing customers of a small-town bookstore as a “female community … mediated by the distances of modern mass publishing. Despite the distance, the Smithton women feel personally connected to their favorite authors because they are convinced that these writers know how to make them happy” (Radway 1991, 97).

Reading the Romance is an important work because it gave attention to an otherwise dismissed genre and conceived of the readership as a community, even if only vaguely. Radway partly improved on this in her 1991 edition, admitting her theorization of community was “somewhat anemic in that it fails to specify precisely how membership in the romance-reading community is constituted.” Radway conceded the concept of an “interpretative community” (previously used to refer to critics and scholars of literature) might help, but “it cannot do complete justice to the nature of the connection between social location and the complex process of interpretation” (Radway 1991, 8).

This notion of “interpretive community” was coined in the seven years between her first and second editions. And, as she noted, it wasn’t a great fit. An “interpretive community” is a “collectivity of people who share strategies for interpreting, using, and engaging in communication about a media text or technology” (Lindlof 1988, 2002). Radway’s subjects shared little of this.

Rather, Radway was speaking of parasocial relationships between the readers and the author where mass media permit an “illusion of a face-to-face relationship with the performer” (Horton and Wohl 1956, 215)—the authors, in Radway’s case.

It’s interesting that while the concept of parasociality had existed for decades, Radway overlooked it and instead reached for the wrong one: interpretive communities.

References

Horton, Donald, and R. Richard Wohl. 1956. “Mass Communication and Para-Social Interaction.” Psychiatry 19 (3): 215–29. http://dx.doi.org/10.1080/00332747.1956.11023049.
Lindlof, Thomas R. 1988. “Media Audiences as Interpretive Communities.” Annals of the International Communication Association 11 (1): 81–107. http://dx.doi.org/10.1080/23808985.1988.11678680.
———. 2002. “Interpretive Community: An Approach to Media and Religion.” Journal of Media and Religion 1 (1): 61–74. http://dx.doi.org/10.1207/S15328415JMR0101_7.
Radway, Janice. 1991. Reading the Romance: Women, Patriarchy, and Popular Literature. Chapel Hill: University of North Carolina Press.

Tech News issue #29, 2024 (July 15, 2024)

Monday, 15 July 2024 00:00 UTC
2024, week 29 (Monday 15 July 2024)

Tech News: 2024-29

weeklyOSM 729

Sunday, 14 July 2024 09:59 UTC

04/07/2024-10/07/2024

lead picture

Gallery of Overpass Ultra map examples [1] | © dschep | map data © OpenStreetMap contributors

Mapping campaigns

  • The humanitarian collaborative mapping campaign in response to the 2024 Rio Grande do Sul Floods (Brazil) is ongoing. The effects of the disaster that led to landslides, floods, and a dam collapse persist and 5,000 people are still homeless in the state. Everyone can collaborate in the open projects.

OpenStreetMap Foundation

  • OpenStreetMap experienced a DDoS attack on Thursday 11 July, causing significant access issues and intermittent service disruptions, which the technical team is actively working to resolve.

Events

  • The State of the Map Working Group is happy to announce that ticketing and programme websites for SotM 2024 are now accessible. Early bird tickets are available at a discounted price until Wednesday 31 July.
  • Did you miss the call for general and academic presentations for the State of the Map 2024? You can still showcase your project or map visualisation by submitting a poster before Sunday 25 August. For inspiration take a look at the posters from SotM 2022.
  • The SotM France 2024 videos are now available on PeerTube.
  • The State of the Map US 2024 highlighted some new developments in pedestrian mapping, the integration of AI into mapping processes, and climate and historical data projects, with presentations on accessibility mapping, OpenStreetMap data validation, and participatory GIS for public land management.

Education

  • IVIDES.org carried out a hybrid workshop on collaborative mapping with OpenStreetMap and web mapping using uMap for a group of geography students from the Federal University of Ceará (Brazil), Pici campus (Fortaleza), and the general public. Raquel Dezidério Souto wrote about the experience in her diary, and the files and video are available in Portuguese.

OSM research

  • Lasith Niroshan and James D. Carswell introduced DeepMapper, an end-to-end machine learning solution that automates updates to OpenStreetMap using satellite imagery.

Maps

  • [1] TrailStash, ‘the home for #mapping projects by @dschep’, tooted that they have created a gallery of Overpass Ultra map examples.

OSM in action

  • Bristow_69 noted that the Dialogues en Humanités festival is using a nice OpenStreetMap-based map, but unfortunately has not given proper credit to OpenStreetMap.
  • EMODnet’s (European Marine Observation and Data Network) map viewer includes base and feature layers from OpenStreetMap.
  • NYC Street Map represents an ongoing effort to digitise official street records, bring them together with other street information, and make them easily accessible to the public. The app was developed with OpenMapTiles and OSM contributors’ data. Users can find the official mapped width, name, and status of specific streets and how they may relate to specific properties. It is possible to see how the street grid has changed over time in a chosen area.
  • Ola Cabs have replaced Google with OSM in their Ola Maps navigation application. The change aimed to reduce costs and provide faster, more accurate searches and improved routing. This transition is part of Ola’s broader strategy to improve users’ experience and independence of navigation technology, which was first introduced in its electric vehicles with MoveOS 4 earlier this year.
  • UtagawaVTT maintains the web platform Opentraveller, where contributors can register their mountain bike and electric bike travel routes and consult online data.

Software

  • HOT has released the production version of fAIr, an assistant for mapping with AI, to a wider audience of OSM communities. The software has been tested and the production website is now accessible (login with your OSM account).
  • Adam Gąsowski has introduced his OSM Helper UserScript, designed to streamline the use of community-built tools by automatically generating relevant links based on what the user is looking at. Future plans include integrating AI for automated tagging and developing a browser extension for Chrome and Firefox.
  • Gramps Web, the open-source, self-hosted family tree application, has added a historical map layer based on OpenHistoricalMap.
  • The 20.1.0.1 beta release of Vespucci included numerous updates, such as the removal of pre-Android 5 code, improvements to error handling and memory management, enhancements to the property editor, and new features such as GeoJSON label support and layer dragging.

Programming

  • MapBliss is an R package for creating beautiful maps of your Leaflet adventures. It allows users to create print-quality souvenir maps, plot flight paths, control label positions, and add custom titles and borders. The package integrates several dependencies and is open for contributions and feature requests.
  • Mattia Pezzotti is documenting his progress in integrating Panoramax with OpenStreetMap as part of Google Summer of Code 2024, providing weekly updates on new features and improvements such as viewing 360-degree images, adding filters, and improving the user interface. This ongoing project was previously covered in weeklyOSM 723.
  • JT Archie described how they optimised large-scale OpenStreetMap data by converting it to a SQLite database, using full-text search and compression techniques, in particular the Zstandard seekable format, to handle data efficiently and improve query performance.

Did you know …

  • … the release of Taiwan TOPO v2024.07.04 continues the tradition of weekly updates started in September 2016? Taiwan TOPO provides detailed topographic data for Taiwan.

OSM in the media

  • In an op-ed in The New York Times, Julia Angwin criticised society’s overreliance on turn-by-turn navigation in Google Maps and calls for greater investment in OpenStreetMap as a public good.

Other “geo” things

  • The Ammergauer Alpen natural park has implemented a visitor monitoring system using sensors and GPS data to manage and protect natural areas while supporting sustainable tourism.
  • Geomob has tooted about the release of the episode #241 of their Geomob podcast, which covers a wide variety of issues, such as the distortion of some electoral maps and the use of drones in agriculture.
  • The Olympic torch relay route can be viewed on the Paris 2024 official website. The uMap Trajet Flamme Olympique 2024, created by @IEN52, shows all the 67 stages of the parcours, including overseas territories. Some other uMaps show the passage of the Olympic Torch in selected cities.
  • The Philippines’s Second Congressional Commission on Education and the Department of Education are partnering to conduct a comprehensive nationwide mapping of private schools starting this July. This initiative aims to inform government policies, optimise resource allocation, and enhance complementarity between the public and private education systems.
  • TomTom and East View Geospatial have partnered to provide Australia’s Department of Defence with global map data, leveraging TomTom’s Orbis Maps for accurate geospatial information critical to national security and disaster response. TomTom’s Orbis Maps is made by conflating open data from Overture and OSM with TomTom partners’ data and TomTom’s proprietary data in a controlled environment.
  • Marcus Lundblad has published his annual ‘Summer Maps’ blog post for 2024, with updates to map visualisations, improvements to search functionality and dialogue interfaces, the addition of a playground icon, support for public transport routing, and the introduction of hill shading for showing terrain topology.
  • Researchers at the Sun Yat-sen University, in collaboration with international experts, have detailed, in the Journal of Remote Sensing, a framework for building extraction using very high-resolution images in complex urban areas, addressing the limitations of existing datasets for urban planning and management.

Upcoming Events

Where | What | When
Salt Lake City | OSM Utah Monthly Map Night | 2024-07-11
Lorain County | OpenStreetMap Midwest Meetup | 2024-07-11
Amsterdam | Maptime Amsterdam: Summertime Meetup | 2024-07-11
Berlin | DRK Online Road Mapathon | 2024-07-11
Wildau | 193. Berlin-Brandenburg OpenStreetMap Stammtisch | 2024-07-11
Zürich | 165. OSM-Stammtisch Zürich | 2024-07-11
Bochum | Bochumer OSM-Treffen | 2024-07-11
Bangalore East | OSM Bengaluru Mapping Party | 2024-07-13
Portsmouth | Introduction to OpenStreetMap at Port City Makerspace | 2024-07-13 – 2024-07-14
København | OSMmapperCPH | 2024-07-14
Strasbourg | découverte d’OpenStreetMap | 2024-07-15
England | OSM UK Online Chat | 2024-07-15
Richmond | MapRVA – Bike Lane Surveying & Mapping Meetup | 2024-07-16
Online | Missing Maps London: Mid-Month Mapathon | 2024-07-16
Bonn | 177. OSM-Stammtisch Bonn | 2024-07-16
Hannover | OSM-Stammtisch Hannover | 2024-07-17
Łódź | State of the Map Europe 2024 | 2024-07-18 – 2024-07-21
Zürich | Missing Maps Zürich Mapathon | 2024-07-18
Annecy | OSM Annecy Carto-Party | 2024-07-18
Online | OSMF Engineering Working Group meeting | 2024-07-19
Cocody | OSM Africa July Mapathon – Map Ivory Coast | 2024-07-20
München | Mapathon @ TU Munich | 2024-07-22
Stadtgebiet Bremen | Bremer Mappertreffen | 2024-07-22
Berlin | OSM-Verkehrswende #61 | 2024-07-23
San Jose | South Bay Map Night | 2024-07-24
Online | OpenStreetMap Foundation board of Directors – public video meeting | 2024-07-25
Lübeck | 144. OSM-Stammtisch Lübeck und Umgebung | 2024-07-25
Wien | 72. Wiener OSM-Stammtisch | 2024-07-25

Note:
If you would like to see your event here, please add it to the OSM calendar. Only events listed there will appear in weeklyOSM.

This weeklyOSM was produced by Aphaia_JP, MatthiasMatthias, PierZen, Raquel Dezidério Souto, Strubbl, TheSwavu, YoViajo, barefootstache, derFred, mcliquid, miurahr, rtnf.
We welcome link suggestions for the next issue via this form and look forward to your contributions.

Teaching AI in Schools

Saturday, 13 July 2024 03:30 UTC

Artificial Intelligence (AI) is a hot topic these days, and it’s natural to wonder how it fits into education. In this article, we will explore the best practices, concerns, and recommendations for integrating AI into school curriculums. I will also provide references to useful tools and learning materials.

Importance of AI education at schools

Why is there a growing interest in teaching AI in schools? AI has become deeply integrated into society, creating new applications and possibilities while also introducing ethical concerns.

A number of tools hosted on Toolforge rely on the replicated MediaWiki databases, dubbed "Wiki Replicas".

Every so often these servers have replication lag, which affects the data returned as well as the performance of the queries. And when this happens, users get confused and start reporting bugs that aren't solvable.

This actually used to be way worse during the Toolserver era (sometimes replag would be on the scale of months!), and users were well educated to the potential problems. Most tools would display a banner if there was lag and there were even bots that would update an on-wiki template every hour.

Many of these practices have been lost since the move to Toolforge, because replag has been basically zero the whole time. Now that more database maintenance is happening (yay), replag is occurring slightly more often.

So to make it easier for tool authors to display replag status to users with a minimal amount of effort, I've developed a new tool: replag-embed.toolforge.org

It provides an iframe that automatically displays a small banner if there's more than 30 seconds of lag and nothing otherwise.

As an example, as I write this, the current replag for commons.wikimedia.org looks like:

The replica database (s4) is currently lagged by 1762.9987 seconds (00:29:22), you may see outdated results or slowness. See the replag tool for more details.

Of course, you can use CSS to style it differently if you'd like.
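Embedding the banner is just a matter of dropping an iframe into a tool's page. The snippet below is a hedged sketch: the per-wiki URL path and the sizing are my assumptions, not the tool's documented interface, so check replag-embed.toolforge.org for the actual embed instructions.

```html
<!-- Hypothetical embed: the per-wiki path is an assumption, not a
     documented URL scheme. Per the post above, the iframe renders
     nothing when replication lag is under 30 seconds. -->
<iframe src="https://replag-embed.toolforge.org/commons.wikimedia.org"
        title="Wiki Replicas lag status"
        style="border: 0; width: 100%; height: 3em;"></iframe>
```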

I've integrated this into my Wiki streaks tool, where the banner appears/disappears depending on what wiki you select and whether it's lagged. The actual code required to do this was pretty simple.

replag-embed is written in Rust, of course (source code), and leverages in-memory caching to quickly serve responses.

Currently I'd consider this tool to be beta quality - I think it is promising and ready for other people to give it a try, but know there are probably some kinks that need to be worked out.

The Phabricator task tracking this work is T321640; comments there would be appreciated if you try it out.

I read in the newspaper that Minister Muhammed Riyas told the Legislative Assembly that AI kiosks will be installed so that language does not become a barrier for tourists. According to the minister, kiosks powered by artificial intelligence will respond to visitors in their own language. ("AI kiosks to be installed to help tourists overcome language barriers" – Deshabhimani newspaper, July 12, 2024.)

Some questions: How do tourists currently learn about a tourist destination and get their doubts cleared, and what shortcomings does that have? What facility will these kiosks offer that is not already available on a mobile phone with an internet connection? Is there any information that is unavailable on the internet and can only be obtained from the kiosks?

This Month in GLAM: June 2024

Friday, 12 July 2024 02:31 UTC

Geneva, Switzerland — Yesterday, the Wikimedia Foundation, the nonprofit that hosts and supports Wikipedia and other Wikimedia projects, was again denied accreditation as a permanent observer to the World Intellectual Property Organization (WIPO) — the specialized United Nations (UN) agency that determines global policies on copyright, patents, and trademarks for its 193 Member States. 

Observer status would enable the Wikimedia Foundation to participate and contribute to WIPO committees where intellectual property norms are set. For the fourth time, China opposed the Foundation’s request for observer status, based, once again, on false accusations that the Foundation is complicit in spreading disinformation. China misrepresented Wikipedia’s volunteer-driven policies and practices, all of which are rooted in accuracy and neutrality and help effectively counter misinformation and disinformation online.

As the host of the world’s largest online encyclopedia, the Wikimedia Foundation has a material interest and deep, practical expertise in many of the issues being discussed at WIPO, including traditional knowledge, copyright, access to knowledge during times of crises, and Artificial Intelligence (AI). The Foundation’s presence at WIPO would help to ensure that the future of copyright truly reflects the global and diverse needs of the internet. Given that the content on Wikipedia and other Wikimedia projects also plays an essential role in training almost every large language model (LLM), the Foundation can offer valuable recommendations and unique insights as WIPO strives to understand and respond to the impact of AI on intellectual property rights.

“In the age of AI, Wikipedia is at the forefront of global copyright debates. Our experience at the Wikimedia Foundation can help WIPO Member States achieve meaningful policy transformations to protect open knowledge and content creation for the public interest,” said Stephen LaPorte, General Counsel of the Wikimedia Foundation. “We regret that the Foundation has once again been denied the opportunity to participate as observers at WIPO, especially on the basis of erroneous statements. We call on WIPO leadership to find a solution that can resolve this deadlock. Until then, we will continue to seek opportunities to represent open knowledge and the public interest at WIPO and beyond. Since 2022, our consultative status at the UN Economic and Social Council (ECOSOC) has allowed us to actively contribute to global initiatives like the Global Digital Compact, and we hope to one day share our expertise with WIPO as well.”

For 21 years, the Wikimedia Foundation has continuously contributed to country-level legislative processes on intellectual property, stressing the importance of balanced copyright laws for hosting content on Wikipedia and any other free and open online spaces designed for the public interest. Moreover, in times of crisis, conflicts, and pandemics, Wikimedia projects provide critical and reliable information that must remain available and be protected in forums like WIPO. 

The Foundation applied as a permanent observer to WIPO in 2020, 2021, 2023, and again this year, 2024. Our application was once again denied during WIPO’s General Assembly meeting based on a lack of consensus caused by China’s opposition. China has also previously blocked applications from Wikimedia affiliate groups and chapters seeking permanent or ad hoc observer status in WIPO. The Netherlands, as coordinator of the WIPO group of industrialized countries (which includes Australia, Israel, Japan, New Zealand, Norway, Turkey, the Holy See, and many European Union member states), the United States (US), France, Canada, Switzerland, and the United Kingdom (UK) expressed public support for the Foundation’s application. Supporting countries highlighted the Foundation’s valuable insights and experiences, demonstrating its involvement in global copyright issues and relevance to WIPO’s work. 

The Wikimedia Foundation is an active and respected contributor and shaper of policies and practices concerning access to knowledge and information around the world. We hope that UN Member States and WIPO leadership will act to help advance global access to free knowledge by enabling the Foundation’s observer status application to move forward in the near future.

About the Wikimedia Foundation

The Wikimedia Foundation is the nonprofit organization that operates Wikipedia and other Wikimedia free knowledge projects. Our vision is a world in which every single human can freely share in the sum of all knowledge. We believe that everyone has the potential to contribute something to our shared knowledge and that everyone should be able to access that knowledge freely. We host Wikipedia and the Wikimedia projects, build software experiences for reading, contributing, and sharing Wikimedia content; support the volunteer communities and partners who make Wikimedia possible. The Wikimedia Foundation is a United States 501(c)(3) tax-exempt organization with offices in San Francisco, California, USA.

For media inquiries, please contact press@wikimedia.org

The post Wikimedia Foundation’s Accreditation to World Intellectual Property Organization Blocked for a Fourth Time by China appeared first on Wikimedia Foundation.

Anne-Christine Hoff is an associate professor of English at Jarvis Christian University.

Back in January of this year, I took a three-week, six-hour introductory course on Wikidata through the nonprofit Wiki Education. Before the course’s start, I knew little to nothing about Wikidata, and I had several preconceived notions about the database and its uses before I began the course.

My first impression of Wikidata was that AI bots ran the system, sweeping Wikipedia pages and then using that information to create data sets under various pre-defined headings. In my conception, Wikidata’s information updated only when editors on Wikipedia changed or added pages. I thought of Wikidata as a closed system, and I assumed the point of the course would be to learn how to run queries, so that we students could figure out how to access the data collected through Wikipedia. 

I remember asking my Wiki Education instructor about the role of AI in Wikidata, and he very pointedly responded that bots cannot program anything on their own. Instead, humans program Wikidata, and through this programming capability, both humans and machines can read and edit the system.

Anne-Christine Hoff
Image courtesy Anne-Christine Hoff, all rights reserved.

Wired writer Tom Simonite provided an example of this phenomenon in his article “Inside the Alexa Friendly World of Wikidata”:

“Some information is piped in automatically from other databases, as when biologists backed by the National Institutes of Health unleashed Wikidata bots to add details of all human and mouse genes and proteins.” 

This same article also discusses a further example, published in a paper by Amazon in 2018, of Wikidata teaching Alexa to recognize the pronunciation of song titles in different languages.

Both of these examples do a good job of illustrating another one of my misconceptions about Wikidata. As mentioned before, I thought the system was centralized and, apart from periodic updates, static. I did not conceive of the difference between data collected through documents (like Wikipedia) and a database with an open and flexible, relational communication system. 

What I discovered was vastly more interesting and complex than what I had imagined. Wikidata was not a bot-driven system collecting data from Wikipedia entries; it was a communication system that can accept data in multiple languages. An editor in Beijing may enter information in Chinese, and that data will immediately be available in all the languages Wikidata uses. This allows a self-structuring repository to grow as users add localized data from all over the world.
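As a rough illustration (a hand-written sketch, not the full Wikidata JSON model), an item's claims are language-independent while its labels are stored per language, which is what makes a statement entered in one language immediately readable in every other:

```python
# A simplified sketch of how a Wikidata item separates language-independent
# data from per-language labels. Q270 is Warsaw's real Wikidata ID; the
# structure shown here is illustrative, not the complete data model.
item = {
    "id": "Q270",
    "labels": {
        "en": "Warsaw",
        "zh": "华沙",
        "pl": "Warszawa",
    },
}

def label_for(item, lang):
    """Return the item's label in the requested language, if available."""
    return item["labels"].get(lang)
```

A reader in any language sees the same underlying item, rendered through whichever label exists for their language.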

In 2013, Wikidata’s founder, Denny Vrandečić, wrote about the advantages that a database like Wikidata has over documents because “the information is stored centrally from where it can be accessed and reused independently and simultaneously by multiple websites without duplication.” In his article “The Rise of Wikidata,” Vrandečić made clear that Wikidata is not just a database for Wikipedia and other Wikimedia projects. It can also be used “for many different services and applications, from reusing identifiers to facilitate data integration, providing labels for multilingual maps and services, to intelligent agents answering queries and using background knowledge” (Vrandečić, 2013, p. 90). 

This raises the question of how Wikidata intelligently reads the information stored on its platform. My first misconception had to do with my belief that Wikidata was a flat collection of data based on Wikipedia’s entries. What I didn’t understand was that the crux of Wikidata’s intelligence comes from its ability to understand data in a relational way. As noted in “Familiar Wikidata: The Case for Building a Data Source We Can Trust,” Wikidata’s semantic structure is based on rules, also known as the Wikidata ontology. According to this ontology, a person may have a “born in” relationship to a place, but a place cannot have a “born in” relationship to other entities. For example, Marie Curie can be born in Warsaw, but Warsaw cannot be born in Marie Curie. 

This knowledge-based structure is the key to understanding how Wikidata’s identifiers are used to connect to one another. In Wikidata’s logical grammar, two entities connect to one another by a relationship, also known as a “triple.”  It is this triple structure that creates the structural metadata that allows for intelligent mapping.  A fourth item, a citation, turns each triple into a “quad.” The fourth item is crucial to Wikidata’s ability to further arrange the data relationally, by making clear where the data in the triple originates, then arranging the data hierarchically based on its number of citations. 
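The triple and quad structure described above can be sketched as a small record (a minimal illustration: Q7186, P19, and Q270 are the real Wikidata identifiers for Marie Curie, "place of birth", and Warsaw, while the reference URL is a placeholder):

```python
# A "triple": subject –(predicate)→ object, expressed with Wikidata IDs.
# Q7186 = Marie Curie, P19 = place of birth, Q270 = Warsaw.
triple = ("Q7186", "P19", "Q270")

# Appending a citation as a fourth element turns the triple into a "quad",
# recording where the claim in the triple originates.
quad = triple + ("https://example.org/source",)  # placeholder reference

subject, predicate, obj, reference = quad
```

The direction of the predicate encodes the ontology rule mentioned earlier: Q7186 can stand in a P19 relationship to Q270, but not the reverse.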

Having access to the Wiki Education dashboard, I was able to see the edits of the other students taking the class. One student whom I’ll call Miguel was adding missing information about Uruguayan writers on Biblioteca Nacional de Uruguay’s catalog. As of this writing, he has completed more than 500 edits on this and other subjects, such as the classification of the word “anathema” as a religious concept. Two Dutch archivists were adding material on Dutch puppet theater companies in Amsterdam and Dutch women in politics. An Irish student was updating information on a twelfth century Irish vellum manuscript and an English translation of the Old Irish Táin Bó Cúailnge by Thomas Kinsella. 

What I saw when I perused the subjects of edits was exactly what the article “Much more than a mere technology” mentions, that is, that Wikidata is capable of linking local metadata with a network of global metadata. This capability makes Wikidata an attractive option for libraries wanting to “improve the global reach and access of their unique and prominent collectors and scholars” (Tharani, 2021). 

Multiple sources contend that Wikidata is, in fact, a centralized storage database, and yet the intelligence of Wikidata makes this description ring hollow. It is not a database like the old databases for documents. Its ontological structure allows it to understand the syntax of data and arrange that information relationally into comprehensible language. As in the example of the National Institutes of Health biologists who programmed Wikidata bots to add details of human and mouse genes and proteins, it can also be programmed to exchange data with external databases. Its linking capabilities make it possible for librarians and archivists from around the world to connect their metadata to a network of global metadata. Its multilingual abilities have a similar decentralizing effect, allowing users to create structured knowledge about their own cultures, histories, and literature in their own languages. 

If you are interested in taking a Wikidata course, visit Wiki Education’s course offerings page to get started.


Explore the upcoming Wikidata Institute, Wikidata Salon, and other opportunities to engage with Wikidata at learn.wikiedu.org.

Trouble with some wikis

Wednesday, 10 July 2024 15:26 UTC

Jul 10, 15:26 UTC
Resolved - This incident has been resolved.

Jul 10, 15:18 UTC
Monitoring - A fix has been implemented and we are monitoring the results.

Jul 10, 15:05 UTC
Investigating - We are aware of issues with accessing some wikis, and we are investigating.


A statement from Wikimedia Australia


Wikimedia Australia (WMAU) and the WMAU Board would like to acknowledge and give thanks to the Movement Charter Drafting Committee (MCDC) for their hard work over many years to produce the current Movement Charter and the Supplementary Documents. WMAU strongly supports the need for a Movement Charter as a Movement Strategy priority and appreciates the huge contribution the MCDC have made towards achieving this.

WMAU strongly endorses the aims of the Movement Strategy to increase diversity and equity in representation and inclusive decision-making across the global Wikimedia community. Current centralisation of power in the Wikimedia Foundation and the 12 WMF Board of Trustees Members is not representative or equitable, and is no longer appropriate for a global public interest platform.

Despite the significant time and effort already invested in the Charter process, the WMAU Board does not believe this is reason enough to ratify the proposed model as is. Although the Movement Charter is moving in the right direction, the WMAU Board is concerned that the model as proposed leaves open too much potential for unintended consequences.

WMAU’s chief concerns are that the proposed model:

  • is complex and bureaucratic
  • does not provide appropriate mechanisms for review, evaluation and iteration
  • does not provide adequate mechanisms for oversight and ensuring transparency and accountability
  • does not make it clear how diversity, inclusion and representation will be achieved
  • does not adequately communicate the separation of responsibilities between the Global Council, Global Council Board, the Wikimedia Foundation and the WMF Board of Trustees, resulting in a lack of clarity in relation to the operation of the Global Council and the Global Council Board.

As a result the WMAU Board feel they cannot vote yes in good conscience. It is for these reasons the WMAU Committee has opted to abstain by making a blank vote.

We did not come to this decision lightly. We discussed the proposed model at length within the Board and with our Chapter membership at a public meeting. In reaching this decision, the WMAU Board wants to make it clear that we and the Chapter remain committed to supporting and promoting diversity, inclusion and representation in the Wikimedia community, and we support ongoing moves towards more equitable and inclusive decision-making with respect to all Wikimedia Movement Organisations. We support a renewed effort to improve the current Charter. To that end, we recommend the MCDC consider separating ratification of the Principles and the parts of the Charter outlining the roles of existing Movement Bodies from the far more ambitious proposal to set up a Global Council and Global Council Board. The WMAU Board endorses the Charter Principles and Values and welcomes the clarity the Charter provides on the roles of various Movement Bodies. Our concerns relate to the need for more consideration of the constitution, representation, resourcing, voting, transparency, accountability and amendment processes of the Global Council and the Global Council Board.

In particular, we are extremely concerned that the model is deliberately difficult to amend, with unclear review or evaluation processes. This is a major issue given the complexity of the structure that is being proposed. As others have noted, this directly contradicts the Recommendation of the Movement Strategy #10 Evaluate, iterate, and adapt. We would like to see a model that is more adaptable and open to oversight, evaluation and review to reduce the risks associated with introducing a complex and bureaucratic new layer of governance such as the Global Council.

Beyond the proposed Charter itself, the WMAU Board wishes to flag concerns with the ratification process as well. Legitimate questions can be raised as to the role of the WMF Board of Trustees in the ratification process. Specifically, we note that the voting arrangement effectively gives the WMF Board of Trustees a veto over the passage of the Charter. Regardless of how that is wielded, it undermines the legitimacy of the spirit of community based decision-making the Charter seeks to enact.

We are also concerned that the Board Liaisons’ Reflections, published on Friday 21 June 2024, had a negative impact on the Charter ratification process. Whether intended or not, by publicly recommending that the WMF Board of Trustees not ratify the Charter, the Board Liaisons unduly influenced community discussion of the Charter (and likely how votes were cast): the release of that recommendation could reasonably be read as an announcement of how the WMF Board of Trustees intended to vote, regardless of whether its vote ultimately followed the recommendation. This action was counter to the MCDC’s request that the WMF Board of Trustees’ vote not be shared until after the vote of individuals and affiliates had concluded, precisely to avoid influencing the voting. Unfortunately, the release of the Board Liaisons’ recommendation has been widely construed as a deliberate attempt to influence the vote; whether or not that was the intention, it has been both the effect and the perception.

WMAU looks forward to working together with the different stakeholders on next steps in the ongoing journey towards better governance and decision-making for the global Wikimedia community.  

Wikimedia Australia Board

Documenting manhole covers in Spain

Tuesday, 9 July 2024 05:13 UTC

Fremantle

· Wikimedia · photography ·

A fascinating journey: 10 years of manhole cover photography from our community, 8 July 2024 by Sara Santamaria:

Documenting a manhole cover has become an essential part of the community’s trips and outings. Over the years, some members have developed an affinity for certain covers that they consider particularly representative. Mentxu Ramilo, for example, found a 1925 manhole cover in Vitoria-Gasteiz that she found fascinating. “I let myself be infected by the Wikimedian spirit and passions, and by everything that forms part of the graphic heritage and deserves to be documented,” explains Mentxu.

I think we of WikiClubWest are going to have to up our game in cataloguing all the street things! :-)

Tech News issue #28, 2024 (July 8, 2024)

Monday, 8 July 2024 00:00 UTC
2024, week 28 (Monday 08 July 2024)

Tech News: 2024-28

weeklyOSM 728

Sunday, 7 July 2024 10:03 UTC

27/06/2024-03/07/2024

lead picture

SotM France 2024 – Lyon [1] | © OSM-France

Mapping

  • Marco Antonio mapped El Cardón National Park in Bolivia using official boundary data from PROMETA, an environmental conservation organization of Tarija, Bolivia.
  • Roxystar is currently mapping street lamps in Munich, complete with additional details such as the lamp’s height, to simulate the light coverage by using OSMStreetLight.
  • rtnf on Mastodon emphasised the importance of mapping building entrances to help people avoid getting lost, citing personal experience of having to circle a building to find the entrance. znrgl points out in the conversation that it is easy to record entrances with the Every Door app at any time while travelling.
  • DENelson83 has completed a project to manually map all the forested areas on Vancouver Island from aerial imagery, improving the detail and accuracy of the island’s forested regions on OpenStreetMap.
  • Comments were requested on the following:
    • The proposal to deprecate crossing=zebra in favour of crossing:markings.
    • The proposal to introduce the volunteers: prefix for locations/features that have need of volunteers, including whether new volunteers are accepted, urgency of need, signup information, and benefits for volunteers.

Mapping campaigns

  • The Open Mapping Hub – Asia Pacific from HOT celebrated the winners of the Climate Change Challenge, recognising the efforts to generate valuable data through OpenStreetMap in 14 Asia Pacific countries. Special thanks were given to Open Mapping Gurus from Nigeria, Peru, and Niger, and the winning teams will soon receive their prizes. Countries mapped include Indonesia, India, the Philippines, Nepal, and more.
  • Pavy_555 visited JNTU Hyderabad, to promote smart mobile mapping using the Every Door app, emphasising community engagement and the importance of updating OpenStreetMap data with local amenities and micro-mapping efforts.
  • IVIDES.org is promoting a campaign for the collaborative mapping of the Brazilian coastal and marine zones. The project uses OpenStreetMap and will be carried out to evaluate aspects related to the sustainability of this strategic region. Registration is open for participation in the pilot mapping, and the research coordinator presents the initiative in her diary.

Community

  • The OpenStreetMap community is invited to participate in WikiCon 2024, taking place from 4 to 6 October in Wiesbaden, Germany. Volunteers are needed to staff the OSM booth and promote the project to a wider audience. Travel and accommodation costs can be covered by FOSSGIS e.V. for participants from outside the Wiesbaden or Rhein-Main area. If you are interested, you can note this directly on the wiki page.

Events

  • [1] Bristow presents a photo retrospective of the 10th SotM France conference, held in Lyon from 28 to 30 June 2024. Attendance records were broken, with over 300 people taking part. Recordings of the presentations will soon be available online on PeerTube.
  • The deadline for early bird pricing for the 2024 State of the Map from 6 to 8 September has been extended till 31 July.
  • The FOSS4G Perth 2024 conference, scheduled for 23 October in conjunction with the ISPRS TC IV Mid-Term Symposium, has opened its Call for Presentations, inviting the open geospatial community to share insights on tools such as QGIS, PostGIS, and OpenStreetMap.
  • The State of the Map 2024 programme offers a diverse range of sessions, workshops, and lectures. The event will occur from 6-8 September, in Nairobi, Kenya, covering topics such as sustainable transport, local mapping initiatives, integration into academic curricula, and innovative data collection methods.

Education

  • OpenStreetMap contributor Denis_Helfer is organising an introduction to OSM on the 15 July in Strasbourg, France. This will likely be followed by a series of workshops in autumn.

Maps

  • JveuxDuSoleil is a web application that simulates urban shadows to help users find sunny terraces in cities such as Paris, Marseille, and Nantes. Users can zoom in on the map to see where terraces will be sunny at certain times. However, the project faces functionality issues as building models and their shadows are no longer generated due to maintenance issues.

OSM in action

  • The ‘Los Pueblos más Bonitos de España’ website offers a guide to the most beautiful villages in Spain, with resources such as an OpenStreetMap-based village map application for geolocalised travel and a guidebook for sale to help organise trips to these charming places.
  • The GLOBE programme’s data visualisation tool allows users to explore environmental data collected around the world, filtering by protocol, date range, and geographical location, with options to download and analyse specific datasets for educational and scientific purposes.
  • The Toll/ST Ceritapeta tool allows users to visualise and measure driving distances from various toll gates and train stations in Jakarta, Indonesia, on an OpenStreetMap background. This tool can aid decision-making when choosing a residential complex in the suburbs of the Jakarta Metropolitan Area, as driving distances to the nearest transport infrastructure serve as a good indicator of connectivity.
  • The Naturkalender ZAMG map allows users to explore various natural observations, such as plant and animal phenology data. It provides detailed visualisations of seasonal changes and species distribution, supporting citizen science, and ecological research.
  • The Mosquito Alert map displays real-time reports of mosquito sightings and breeding sites submitted by users on an OSM background, contributing to public health research and control efforts. The interactive map allows users to explore mosquito data geographically, helping to track the spread and presence of different mosquito species.
  • Norbert Tretkowski navigated around Norway using Organic Maps on a Google Pixel 3, detailing the app’s performance and challenges with features such as tunnel navigation, estimated arrival times, and ferry integration.
  • velowire.com displays the routes of the most important cycle races on OpenStreetMap maps and offers them for download.
  • NNG and Dacia have partnered to offer Dacia drivers OSM based navigation maps, providing a community-driven, frequently updated, and feature-rich map solution to enhance the driving experience.

Open Data

  • The Heidelberg Institute for Geoinformation Technology (HeiGIT) has made OSM land use data available on HeiData, providing TIFF tiles for EU countries and the UK. This data is derived from Sentinel-2 imagery and OpenStreetMap, which is classified into categories such as agricultural areas and urban regions using a deep learning model. The datasets can be used by urban planners, environmental researchers, and others for various applications.

Software

  • Badge(r)s is a location-based GPS game where players collect virtual items, quadrants, and regions, acting as both creators and collectors. Badges, the primary virtual items, appear on the map at specific coordinates or in players’ collections.
  • The June 2024 MapLibre newsletter announced two minor releases of MapLibre GL JS, progress on a Vulkan backend for MapLibre Native, and the release of Martin Tile Server v0.14. It welcomed new sponsors and highlighted upcoming events, including FOSS4G EU and State of the Map Europe.
  • Amanda details improvements and ongoing issues with WaterwayMap.org, including a new flow direction grouping feature, bugs in river bifurcation calculations, and gaps caused by geojson-to-vector tile conversion, and invites feedback and discussion from the community.

Programming

  • emersonveenstra introduced the ‘Rapid Power User Extension’, a new Chrome/Firefox extension that integrates with OpenStreetMap to redirect edit buttons to Rapid and add Strava heatmap support as overlay imagery. The extension is in early development, and users are encouraged to report issues and suggestions on GitHub.
  • Mark Stosberg explored the optimisation of Minneapolis’ low-stress bicycle network connectivity using spatial analysis for generating isochrones to measure bicycle travel distances within the network. He described his process using QGIS, JOSM, and Valhalla to create a customised routing network and generate multiple isochrones. The aim is to prioritise segments for improvement based on their impact on overall connectivity.
  • The new osmapiR package is now published at CRAN, the official repository for R packages. After almost one year of development and polishing, the package implements all API calls and includes a complete documentation with examples for all functions. With this publication and existing packages osmdata (implementing overpass calls) and osmextract (work with .pbf files), R is now a first class language to work with OpenStreetMap.

Did you know …

  • … the map 1NITE TENT, where private individuals offer overnight accommodation with a tent on their property? This is particularly useful in countries where wild camping is prohibited.
  • … about the different tools to convert opening hours into OSM syntax, display them, and fix any errors?

Other “geo” things

  • Robin Wilson has created a demo app for searching an aerial image using text queries like “tennis courts” or “swimming pool”. Under the hood, it extracts embedding vectors from the SkyCLIP AI model for small chips of the image and compares them using vector similarity metrics.
  • Cameroon and Nigeria have agreed to resolve their long-standing border dispute through joint on-the-ground verification and demarcation, with the aim of completing the process by the end of 2025 without recourse to the courts. The agreement, facilitated by the Cameroon-Nigeria Mixed Commission, focuses on areas such as Rumsiki, Tourou, and Koche, and addresses the challenges posed by Boko Haram terrorism in the region.
  • tlohde discussed the concept and application of average colors in digital maps, highlighting how averaging colors can simplify images while maintaining their recognisable features.
  • Grant Slater shared that he has updated the ZA-aerial with all the latest 25 cm resolution aerial photos, related to the national coverage of South Africa, provided by the South African National Geo-spatial Information (NGI). The full announcement can be found in the mailing list of the OSGeo Africa.
  • The initial release of the Panoramax Android app, announced at the State of the Map France 2024, offers an alpha/beta version available for download as an APK, and will be published on the Play Store and F-Droid. The app allows users to contribute geolocated photos to the Panoramax database, a free alternative to Google Street View for OpenStreetMap.

Upcoming Events

Where What When
Tartu linn FOSS4G Europe 2024 2024-06-30 – 2024-07-07
中正區 OpenStreetMap x Wikidata Taipei #66 2024-07-08
Lyon Pique-nique OpenStreetMap 2024-07-09
München Münchner OSM-Treffen 2024-07-09
San Jose South Bay Map Night 2024-07-10
Salt Lake City OSM Utah Monthly Map Night 2024-07-11
Bochum Bochumer OSM Treffen 2024-07-10
Lorain County OpenStreetMap Midwest Meetup 2024-07-11
Amsterdam Maptime Amsterdam: Summertime Meetup 2024-07-11
Berlin DRK Online Road Mapathon 2024-07-11
Wildau 193. Berlin-Brandenburg OpenStreetMap Stammtisch 2024-07-11
Zürich 165. OSM-Stammtisch Zürich 2024-07-11
Portsmouth Introduction to OpenStreetMap at Port City Makerspace 2024-07-13 – 2024-07-14
København OSMmapperCPH 2024-07-14
Strasbourg découverte d’OpenStreetMap 2024-07-15
Richmond MapRVA – Bike Lane Surveying & Mapping Meetup 2024-07-16
England OSM UK Online Chat 2024-07-15
Missing Maps London: (Online) Mid-Month Mapathon 2024-07-16
Bonn 177. OSM-Stammtisch Bonn 2024-07-16
Hannover OSM-Stammtisch Hannover 2024-07-17
Łódź State of the Map Europe 2024 2024-07-18 – 2024-07-21
Zürich Missing Maps Zürich Mapathon 2024-07-18
OSMF Engineering Working Group meeting 2024-07-19
Cocody OSM Africa July Mapathon – Map Ivory Coast 2024-07-20
Stadtgebiet Bremen Bremer Mappertreffen 2024-07-22

Note:
If you would like to see your event here, please add it to the OSM calendar. Only data which is entered there will appear in weeklyOSM.

This weeklyOSM was produced by Raquel Dezidério Souto, SeverinGeo, Strubbl, barefootstache, derFred, euroPathfinder, mcliquid, muramototomoya, rtnf.
We welcome link suggestions for the next issue via this form and look forward to your contributions.

Rewriting feed URLs

Sunday, 7 July 2024 05:41 UTC

Fremantle

· MediaWiki · Wikimedia · RSS · indieweb ·

I've finally got my RSS feeds back up and running. The issue ended up being the fact that I'm running my wikis in the site root directory, i.e. without the /wiki/ in the URL like Wikimedia sites have. I've never liked the redundancy of it, and especially with .wiki domains it looks a bit silly (e.g. freo.wiki/wiki/…).

I thought for ages that it was because of the precedence of RewriteRules within Directory sections vs those within VirtualHost sections, but that was a wild goose chase. It was actually that MediaWiki prioritises the title it finds in PATH_INFO over one supplied in the query string, so /Foo?title=Bar is seen as having a title of Foo instead of Bar.
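The precedence behaviour can be modelled in a few lines (a simplified sketch of the rule, not MediaWiki's actual code):

```python
def resolve_title(path_info, query_title):
    """Simplified model: a title found in PATH_INFO wins over one
    supplied in the query string, so /Foo?title=Bar resolves to Foo."""
    if path_info:
        return path_info.lstrip("/")
    return query_title

# With PATH_INFO disabled, only the query-string title remains in play,
# which is what the rewrite-based fix relies on.
```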

To fix it, I turned off $wgUsePathInfo, set the $wgArticlePath to include the full domain name (bad, perhaps; this might come back to bite me), and then appended the path info as ?title= in a rewrite rule. So the Apache config looks like this:

<Directory "/var/www/mediawiki">
        RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-f
        RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-d
        RewriteRule ^(.*)$ %{DOCUMENT_ROOT}/index.php?title=$1 [L,QSA,B]
</Directory>
<VirtualHost *:443>
        DocumentRoot /var/www/mediawiki
        RewriteRule ^/news.rss /index.php?title=Special:CargoExport&table=posts&… [NC,QSA]
</VirtualHost>

And the MediaWiki config like this:

$wgScriptPath = '';
$wgArticlePath = 'https://' . $_SERVER['SERVER_NAME'] . '/$1';
$wgUsePathInfo = false;