Indigenous peoples and the Wikimedia movement: Three takeaways from the Arctic Knot Wikimedia Language Conference
9 August 2021 by Mali Brødreskift (WMNO)
Logo: Arctic Knot Wikimedia Language Conference.
The United Nations’ theme for this year’s International Day of the World’s Indigenous Peoples is “Leaving no one behind: Indigenous peoples and the call for a new social contract.”
According to the UN, a social contract is “an unwritten agreement that societies make to cooperate for social and economic benefits.” The theme calls attention to the reality that many Indigenous peoples and communities were excluded from their countries’ social contracts from the start. It also includes a call to action:
“The new social contract must be based on genuine participation and partnership that fosters equal opportunities and respects the rights, dignity and freedoms of all. Indigenous peoples’ right to participate in decision-making is a key component in achieving reconciliation between indigenous peoples and States.”
United Nations’ International Day of the World’s Indigenous Peoples
How can we as contributors to Wikimedia projects contribute to the new social contract that includes Indigenous peoples?
What we learned at the recent Arctic Knot Wikimedia Language Conference is this: our movement has many tools, but the true magic will always be in the community.
The Arctic Knot conference is part of the Celtic Knot series of annual convenings that invite participants to look at small and underrepresented languages and their use on the Wikimedia projects. Beyond the challenges that all small languages face, Indigenous languages face a series of additional ones. Arctic Knot provided space and focus for Indigenous language communities to connect with each other and with others in the movement.
Language spiral showing the diversity of languages spoken by the participants at the Arctic Knot Wikimedia Language Conference 2021. Among the 180 participants, we spoke 75 languages.
Here are three key takeaways from this year’s event to consider as we look to further support Indigenous peoples and languages on Wikimedia projects.
- Build with, not for
From Sámi people, we often hear stories of how projects are launched to fix something in their community without their involvement or any interest in shared ownership over project timelines and results. This is a story we hear from many of our colleagues around the world working with Indigenous peoples. The Declaration of the Decade of Indigenous Languages 2022–2031 puts Indigenous peoples at the forefront with the slogan: “Nothing for us without us”.
This seems to come naturally to Wikimedians. Throughout the conference, everyone working with Indigenous peoples emphasized that everything we do has to be on the terms of the Indigenous peoples themselves. In the words of Oscar Costero, president of Wikimedia Venezuela:
“We are there for support, and working as a bridge so they can have this portal and use it as a sort of conductor.”
Oscar Costero, President Wikimedia Venezuela
Image of Liv Inger Somby, Sámi University of Applied Sciences, Guovdageaidnu, North-Norway. By Sabine Rønsen (WMNO), (CC BY-SA 4.0).
Also inspiring is the work done by Wikimedia Canada and the Atikamekw. Among many important and interesting learnings was the importance of seeking support and approval from local leaders when they started working on Wikipetcia. That was not only a way to engage the community, but also a way to show respect for its social structure.
- Understand the unique needs of different Indigenous communities
In the presentation of the Portal for Indigenous People, Oscar Costero described how they first thought that the Wayuu community needed their own language version of Wikipedia. However, their perspective changed when they actually visited with the community and realized that a vast majority of people in the community do not write their language.
They decided to reframe their work completely. The ability to understand the needs of the community and find solutions accordingly is a key factor for long-term success in supporting Indigenous languages. The enthusiasm of the Wayuu people who also took part in the presentation shows that they are working with a tool that creates value and engagement in their community.
Even if each community has different needs, the tools created along the way can be adjusted, repurposed, and improved. The Portal for Indigenous Peoples, created through an impressive collaboration across Latin America, is a brilliant example of just that. And through a space such as the Arctic Knot, even more people get to see what is being done; hopefully, bridges will form not only between Wikimedians and local Indigenous communities, but between different groups of Indigenous peoples across the globe.
Our role as Wikimedians is not to decide how Indigenous peoples use Wikimedia projects, but to show the opportunities and work out good solutions together.
- Think outside of the Wikimedia box
To reach “a world in which every single human being can freely share in the sum of all knowledge,” we need to think beyond Wikimedia. Knowledge can be stored in pictures, archives, music, oral traditions, and craft practices to mention a few. To make this knowledge accessible, it is fundamental to build relationships with organizations such as archives and museums, as well as with groups and organizations within Indigenous communities.
Collaborations with Global Voices and Wikitongues are some examples. Wikimedia Norge’s involvement in the UNESCO Decade of Indigenous Languages, as well as reaching out to the Sámi Parliament and Sámi archives, are ways we can establish ourselves as a valuable ally for the Sámi people.
The opening words from the President of the Sámi Parliament in Norway, Aili Keskitalo, brought a strong sense of purpose to the participants at the Arctic Knot Conference. Our hope is that this purpose reaches the whole Wikimedia community. Even if you do not work directly on Indigenous matters, your participation and engagement in the movement matter because of the community each one of us contributes to.
Opening speech by Aili Keskitalo at the Arctic Knot Conference
As a politician, a Sámi language user and a mother, I want us to have the same chances of using Sámi everywhere.
Aili Keskitalo, President of the Sámi Parliament.
The Wikimedia universe is full of amazing tools, but even more amazing are the people using those tools.
If the enthusiasm, openness, and positivity we experienced at the Arctic Knot Conference are representative of the whole Wikimedia community, this movement can provide a safe and productive space for Indigenous peoples: one where they can set the terms for what they want to share about their cultures, as well as how they wish to define themselves and their histories.
Our opportunity is to be the bridge so we can reach the goal of the movement that is motivating so many contributors: everyone’s knowledge accessible for everyone.
Do you want to know more?
The takeaways described here are really just drops from the immense ocean of insight and experience shared at the conference. If you want to dive deeper, check out the Arctic Knot conference program page, where you will find direct links to each presentation from the main program, and the video pool, which gathers all video submissions that did not get a space in the program.
If you prefer going straight to YouTube, we have collected all contributions in this playlist.
At Wikimania 2021, you have a chance to learn more and get involved! Don’t miss the panel debate on Raising the voices of indigenous communities from Latin America or the workshop on a future Language Diversity Hub. Both are scheduled for Tuesday 17th of August.
Why not celebrate this day of the Indigenous Peoples with a unique soundtrack? Check out the Solidarity playlist put together by Subhashish Panigrahi for the Arctic Knot conference, which includes recommendations from participants at the Arctic Knot.
Here’s a quote from a story in a book (The Message in the Bottle - Wikipedia):
Why is it almost impossible to gaze directly at the Grand Canyon under these circumstances and see it for what it is — as one picks up a strange object from one’s back yard and gazes directly at it? It is almost impossible because the Grand Canyon, the thing as it is, has been appropriated by the symbolic complex which has already been formed in the sightseer’s mind. Seeing the canyon under approved circumstances is seeing the symbolic complex head on. The thing is no longer the thing…. it is rather that which has already been formulated—by picture postcard, geography book, tourist folders, and the words Grand Canyon.
That is an interesting list of books.
https://blog.archive.org/2021/08/09/back-to-school-with-the-internet-archive-fall-2021/
Back to School with the Internet Archive: Fall 2021
Posted on August 9, 2021 by chrisfreeland
Back in March 2020, teachers were asking themselves a nearly unthinkable question: “How are we going to get books in students’ hands with our schools & libraries closed?” We’ve heard from hundreds of teachers about the challenges they faced in connecting remote learners with books during COVID. Here is their story:
And here we are in August of 2021, with another school year about to start, and educators are still asking this same question. As a nonprofit dedicated to Universal Access to All Knowledge, the Internet Archive provides a number of free resources for parents, students, teachers, and librarians around the world. Check out these tools for remote learning:
Curated Collections
- Our site is packed with free, kid-friendly learning resources.
- Looking for ways to bring diverse representation into your classroom reading? Find books that support the LGBT+ community in Open Library.
- In 2015, ten-year-old Marley Dias set out to increase representation of books in which black girls are the main character with her #1000BlackGirlBooks campaign. Inspired by Marley, we want to support schools to make learning more inclusive. Find more than 300 of the curated titles in our library.
Lesson Plans
- Looking for lesson plans? Browse our collection to find detailed notes on hundreds of books and themes this summer, including Gulliver’s Travels and Don Quixote.
- Do your students struggle with math? Online tutor The Math Sorcerer has put together a list of math books and resources for self-studying, covering a range of topics and abilities. Borrow the books and help your students gain confidence with math.
Tips for Using Our Library
How long can I borrow a book? How many books can I check out at once? Find all the information you need to know about borrowing books from the lending library in our online tutorials and get reading!
A little late sharing, but those links are probably only going to get better over time, so stashing here for reference.
Wikipedia and Wikipeetia in Wayuú communities in the Colombo-Venezuelan Guajira
10 August 2021 by Gutemonik
The project “TIJITAALÜ WAYUU – WAYUÚ DIGITAL” seeks to support educational processes, carry out media and information literacy exercises, as well as develop and distribute open educational resources in educational centers of the Wayuú indigenous communities in Colombia and Venezuela, in the area of La Guajira.
Currently there are three areas in which we have been providing support and also building solutions together with teachers, directors and leaders of the Wayuú people, in relation to the use and appropriation of Wikimedia ecosystem projects. The areas of work in which we have been advancing are 1) Infrastructure, 2) Access to information and 3) Contributions to Wikimedia ecosystem projects.
The support in terms of infrastructure to the educational process in the framework of this project is given through the implementation and use of a local wireless network. Regarding access to information, through the Wayuu Digital Network we have been providing access to Wikimedia ecosystem projects, including Kiwix with Wikipedia in Spanish and other content in the .zim format. Additionally, we have a MediaWiki-based wiki that, in areas of limited internet access or total disconnection, allows us to bring, use, and contribute content to Wikipeetia, the Wayuunaiki version of Wikipedia.
The under-construction version of Wikipeetia in the Wikipedia incubator serves as a base and in turn complements this construction process. The activities around Wikipeetia have generated spaces and methodologies for contributions by teachers, directors, and Wayuú leaders to the projects of the Wikimedia movement and ecosystem, mainly to Wikipeetia.
Project Goals
Among the objectives set out in the project, there are two that are directly related to the Wikimedia ecosystem:
To ensure that teachers and students have access to the educational sites and can use Wikipedia in Spanish offline as an element that plays a very important role in supporting the educational process, and
That teachers and students can participate offline in the construction of Wikipeetia (Wikipedia in Wayuunaiki) in the educational centers as an exercise to support the educational process, contributing to the use and revitalization of Wayuunaiki and the vindication of the linguistic and cultural rights of the people in these contexts.
Context of Connectivity
Internet access conditions in the Wayuu context in Colombia and Venezuela are very diverse; however, in developing this project we have clearly learned that the indigenous communities we work with are located in areas with no connectivity, or very limited connectivity. In general, people depend on access through cell phones, devices that in turn have very limited coverage in rural areas. These circumstances make it a challenge to provide access to Wikimedia ecosystem projects, which until now could only be used, visualized, or improved with a stable Internet connection.
What is Intended
In the described context of little or no connectivity, a local network, the EJE’IPAJIRAA TIJITAALÜ WAYUU – RED WAYÚU DIGITAL (based on La Red Local Kimera), installed in each educational site, allows us to support the educational sites by giving local access to a version of Kiwix with offline access to Wikipedia in Spanish and other content in the .zim format. The local network also provides access to educational resources and textbooks (including content in Wayuunaiki) and some educational didactic tools.
To create the local network, equipment already available at the school or belonging to teachers is used as a server, and students’ cell phones are used as access devices. In contexts where there is no local network, any commercially available modem or router is used to create one, even discarded units with older ADSL technology.
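The pattern described above can be sketched in a few lines. This is only a stand-in to illustrate the idea (the real deployment uses the Kimera local network, Kiwix, and MediaWiki, none of which appear here): one machine on the school's Wi-Fi serves a directory of content over HTTP, and any phone on the same network can browse to it by IP address, no internet required.

```python
# Minimal stand-in for a local offline content server: serve a directory
# of content over HTTP on the LAN so that phones on the same Wi-Fi can
# reach it. The Wayuu Digital Network stack works on this same principle,
# with Kiwix and MediaWiki in place of a plain file server.
import functools
import http.server
import socketserver
import threading

def serve_content(directory, port=0):
    """Serve `directory` on all interfaces; port 0 picks a free port.

    Returns the running server; the chosen port is server.server_address[1].
    """
    handler = functools.partial(
        http.server.SimpleHTTPRequestHandler, directory=directory)
    server = socketserver.ThreadingTCPServer(("0.0.0.0", port), handler)
    server.daemon_threads = True
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Once running, a student's phone on the same network would simply open `http://<server-ip>:<port>/` in a browser; no account, app, or internet connection is needed, which is what makes the approach workable in disconnected areas.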
Faced with these access challenges, and given the current pandemic, in which face-to-face teaching has been limited, teachers need content to guide elementary and middle school students. This situation has led all actors in the education sector to look for options and opportunities in digital spaces, making both teachers and students more familiar with this format.
The version of MediaWiki installed on the local network facilitates the offline participation of teachers and students in the construction of an initial local version of Wikipeetia and at the same time, allows the advances in the editing process to be accessible to students and the community offline from any type of device with wireless access.
Use of MediaWiki in the Construction and Editing of Wikipeetia
The version of MediaWiki installed in the Wayuu Digital Network facilitates local construction, editing, access, and consultation of Wikipeetia. By incorporating MediaWiki in each local network, we have the option to create or simulate “local” versions of Wikipeetia in Wayuunaiki, expand the number of its editors, and eventually think about bringing this experience to other indigenous languages.
In the case of Wikipeetia, the incubator project (which is only available online) was used to bring these contents to the local MediaWiki versions of the Wayuu Digital Network. These local versions, which differ from one another, are being complemented thanks to the contributions of the teachers participating in the project in the different educational centers. These local contributions will in turn be taken to the Incubator, and in this way the version in the incubator will be complemented.
It should be noted that the local versions in MediaWiki can be configured to provide additional options and opportunities in terms of content and content management. In addition to adjusting to the language in which you work, extensions have been installed that allow the inclusion of multimedia files (audio, image, video), editing with a visual editor, and a friendly interface with the end user. Regarding the inclusion of multimedia files, it is especially important in the case of indigenous languages, and particularly in the Wayuu communities, where orality and visuals are fundamental aspects within the cosmovision and culturally prevail over the textual.
Advances in the Contributions to Wikipeetia
Regarding the development of the work, starting from initial entries on MediaWiki, we have led exercises in which people have edited and added content adjusted to Wayuunaiki. In these exercises, the participants have preferred orality and shown appreciation for visuals. Additionally, in these local versions the editing and citing rules are less strict, which has led a wide number of people to take an interest in contributing.
We have foreseen that, in order to meet the requirements, characteristics, and formats needed to bring locally created articles and content back to the Wikipeetia incubator (following its rules around citation, relevance beyond local contexts, and encyclopedic relevance), specialized editors will be in charge of adjusting the articles and handling the complex processes required.
Regarding how to move articles among the different local networks, update them, and bring them to Wikipeetia, the Wayuu Digital experience allows us to conclude that, within the limited connectivity allowed by WhatsApp, and eventually through physical media, the files containing entries from the local network versions can be transferred to the experienced editors so they can upload them to the incubator.
The people who have had access to the local network have been exploring it and thinking about it based on their realities. We recently learned about a situation in the rural sector of Uribia, a municipality with limited connectivity. One of the most experienced editors and translators of Wikipeetia, who makes contributions whenever his connectivity allows, installed the Wayuu Digital Network on his computer and, while exploring it, found that he no longer depends on connectivity to advance edits to Spanish Wikipedia and Wikipeetia entries; MediaWiki allows him to do this at any time.
Who We Are
Fundación Karisma is a civil society organization working to promote human rights in the digital world. The Innovation and Social Technologies Lab (Lab lTS) experiments primarily in small-scale contexts and local markets, under the framework of innovation and social technologies. It also prioritizes networking with organizations and individuals interested in different topics. Most of the current projects are being carried out in rural areas.
The Internet and Society Center of the Universidad del Rosario (ISUR) is an interdisciplinary space for research and training that works with a public interest and human rights perspective on issues related to the social challenges posed by technological changes. ISUR seeks to generate greater knowledge about the Internet and best practices by companies and States that promote respect for human rights, technological empowerment and the democratization of knowledge and information in Colombia and Latin America.
I am fascinated by their use-case, and love how it works. I am often without online access, and have built personal practices and systems to accommodate continual knowledge building (notice how I copy entire hypertexts, to facilitate learning?)
Also, this is such a baller statement:
Regarding the inclusion of multimedia files, it is especially important in the case of indigenous languages, and particularly in the Wayuu communities, where orality and visuals are fundamental aspects within the cosmovision and culturally prevail over the textual.
Cosmovision.
My work building a multilingual platform has been a gift and a curse: there is so much more knowledge to find in all languages, and I am driven to access it…
This is a very good background story about “tivoization” and what the stakes in that particular battle are. I misunderstood what it meant.
Understanding The Tivoization Rhetoric
So, what did TiVo do that was so objectionable? What was the behavior that Stallman went to work drafting GPLv3 to prevent that TiVo was allowed to do under GPLv2? It’s not, as others widely misreport, that TiVo forbade reinstallation “of the GPL’d software” itself. To my knowledge, TiVo never prevented such reinstallation. No one involved, including me, Stallman, TiVo, or anyone at FSF at the time believed that GPLv2 permitted TiVo to withhold the installation information for the GPL’d software itself. FSF demanded that TiVo provide its users the ability to reinstall Linux (and other GPL’d software, such as GNU bash). What TiVo later did, which some software freedom activists (including Stallman) found objectionable, was that TiVo designed the reinstallation process of that GPLv2’d software to cause the proprietary TiVo application to cease to function. I recall this being widely discussed when TiVo Series 3 was released in mid-2006, and my understanding was that all Series 3 devices had this particular anti-feature. (There were rumors that some of the Series 2 had this anti-feature as well, but not all models.) In other words, if you decided to modify your copy of Linux for the TiVo device and reinstall Linux, the TiVo userspace application would realize that cryptographic lockdown had been breached, and that proprietary software would no longer function. By exercising your reinstallation rights under GPLv2, you’d turn your TiVo DVR into a stand-alone server with some video processing equipment attached. You could use Kodi (which at the time had a different name) to turn that former-TiVo into a FOSS DVR, but your ability to use the proprietary DVR software from TiVo was lost — likely permanently.
Most have of course heard of the negative term “tivoization” that Richard Stallman popularized during the GPLv3 process — which was contemporaneous with the release of the TiVo Series 3. I nevertheless asked Stallman to not use that term — both then and many times since. I still disagree with Stallman’s policy position on the narrow issue of preserving proprietary userspace functionality. Specifically, I just don’t think it matters if, upon upgrading your copylefted software, that the proprietary software that was (to use GPLv2’s terminology) “merely aggregated” alongside the copylefted software continue to function. I felt and still feel that it’s actually better policy to break the (“merely aggregated”) proprietary software (as GPLv2 permits). My policy view is that this breakage inspires and encourages users to install a FOSS alternative for the userspace applications after they’ve reinstalled the FOSS operating system. Nevertheless, Stallman found this practice (using crypto lock-down to force the proprietary software to fail) illegitimate. He noted publicly that GPLv2 didn’t prevent this behavior, and wanted (and wrote, as explained below) a GPLv3 draft that prohibited that behavior.
The Principles of Open Scholarly Infrastructure
Governance
- Coverage across the research enterprise – it is increasingly clear that research transcends disciplines, geography, institutions and stakeholders. The infrastructure that supports it needs to do the same.
- Stakeholder Governed – a board-governed organisation drawn from the stakeholder community builds more confidence that the organisation will take decisions driven by community consensus and consideration of different interests.
- Non-discriminatory membership – we see the best option as an “opt-in” approach with a principle of non-discrimination where any stakeholder group may express an interest and should be welcome. The process of representation in day to day governance must also be inclusive with governance that reflects the demographics of the membership.
- Transparent operations – achieving trust in the selection of representatives to governance groups will be best achieved through transparent processes and operations in general (within the constraints of privacy laws).
- Cannot lobby – the community, not infrastructure organisations, should collectively drive regulatory change. An infrastructure organisation’s role is to provide a base for others to work on and should depend on its community to support the creation of a legislative environment that affects it.
- Living will – a powerful way to create trust is to publicly describe a plan addressing the condition under which an organisation would be wound down, how this would happen, and how any ongoing assets could be archived and preserved when passed to a successor organisation. Any such organisation would need to honour this same set of principles.
- Formal incentives to fulfil mission & wind-down – infrastructures exist for a specific purpose and that purpose can be radically simplified or even rendered unnecessary by technological or social change. If it is possible the organisation (and staff) should have direct incentives to deliver on the mission and wind down.
Sustainability
- Time-limited funds are used only for time-limited activities – day to day operations should be supported by day to day sustainable revenue sources. Grant dependency for funding operations makes them fragile and more easily distracted from building core infrastructure.
- Goal to generate surplus – organisations which define sustainability based merely on recovering costs are brittle and stagnant. It is not enough to merely survive, it has to be able to adapt and change. To weather economic, social and technological volatility, they need financial resources beyond immediate operating costs.
- Goal to create contingency fund to support operations for 12 months – a high priority should be generating a contingency fund that can support a complete, orderly wind down (12 months in most cases). This fund should be separate from those allocated to covering operating risk and investment in development.
- Mission-consistent revenue generation – potential revenue sources should be considered for consistency with the organisational mission and not run counter to the aims of the organisation. For instance…
- Revenue based on services, not data – data related to the running of the research enterprise should be a community property. Appropriate revenue sources might include value-added services, consulting, API Service Level Agreements or membership fees.
Insurance
- Open source – All software required to run the infrastructure should be available under an open source license. This does not include other software that may be involved with running the organisation.
- Open data (within constraints of privacy laws) – For an infrastructure to be forked it will be necessary to replicate all relevant data. The CC0 waiver is best practice in making data legally available. Privacy and data protection laws will limit the extent to which this is possible.
- Available data (within constraints of privacy laws) – It is not enough that the data be made “open” if there is not a practical way to actually obtain it. Underlying data should be made easily available via periodic data dumps.
- Patent non-assertion – The organisation should commit to a patent non-assertion covenant. The organisation may obtain patents to protect its own operations, but not use them to prevent the community from replicating the infrastructure.
Cite as
Bilder G, Lin J, Neylon C (2020), The Principles of Open Scholarly Infrastructure, retrieved [date], https://doi.org/10.24343/C34W2H
OpenCitations is seeking to fully comply (and is most of the way, by point) with “POSI” (OpenCitations’ compliance with the Principles of Open Scholarly Infrastructure | OpenCitations blog).
What a fascinating document! I see this as a template to apply to other domains of human operation (because what else do we call them? Industries? Disciplines? Practices? Yes, all of those, please have principles of good activity in place!).
A little while ago I quipped:
And here is a document that presents:
Formal incentives to fulfil mission & wind-down – infrastructures exist for a specific purpose and that purpose can be radically simplified or even rendered unnecessary by technological or social change. If it is possible the organisation (and staff) should have direct incentives to deliver on the mission and wind down.
Inspired.
Salvaging indigenous languages in Ghana through the Founders’ Day Ghana Writing Contest
11 August 2021 by Ruby D-Brown
Public Domain image
Did you know that one-third of the world’s 7,000 languages are spoken in Africa? While many of these languages are endangered, in West Africa alone, 50 indigenous languages are already on the verge of disappearing.
“In 2018, UNESCO reported that over 300 African languages are endangered and more than 52 have become extinct. The report also stated that, if young people are not taught these languages, even more are likely to go extinct. Of the 230 languages around the world that have gone extinct since 1950, 37 were in Africa.”
Like many parts of the world, Africa is faced with language endangerment. When a language ceases to be learned by young children, its days are numbered, and that is what we see today: parents no longer see the need to speak their native languages to their children. Lack of recognition, documentation, digitization, and general interest in native languages, together with the dominance of English, French, Portuguese, and other languages (a residue of colonization, among other factors), is contributing to the increasing disappearance of indigenous languages in Africa. Our history and culture are lost with the death of each language. It has been estimated that one language dies every 14 days, and if nothing is done about this, by the next century nearly half of the world’s languages will have disappeared, as indigenous languages continue to be abandoned in favor of international languages.
It is said that “Languages hold a world of knowledge,”
Ewa Czaykowska-Higgins
and
“When a language dies, a world dies with it, in the sense that a community’s connection with its past, its traditions and its base of specific knowledge are all typically lost as the vehicle linking people to that knowledge is abandoned. This is not a necessary step, however, for them to become participants in a larger economic or political order.”
Stephen R. Anderson
It is incumbent on us to encourage not only the articulation but also the documentation of information in indigenous languages, to save more languages from dying.
As part of measures to encourage the documentation of our indigenous languages, the Founders’ Day Ghana Writing Contest, an initiative of the Open Foundation West Africa in partnership with the Goethe-Institut, has a special focus on languages this year. This layer of the campaign is dedicated to increasing content in indigenous languages on Wikipedia. The Founders’ Day Ghana Writing Contest is an annual writing contest held in August, in which participants contribute content to Wikipedia about the founding and liberation stories of Ghana, its history, and the contributions that have shaped Ghana’s development today. This year we are intensifying these efforts with a layer of language contributions on the platform. Simply put, we are asking all contributors to create new articles, or improve existing ones, around the theme in both Ghanaian and official languages, in our attempt to decolonize not just the content but also the languages represented on the web.
In 2020, the first iteration of the local campaign garnered over 426 newly created articles from more than 70 participants. More than 30 of the participants were new and joined our community.
In preparation for this year's contest, we have already hosted four workshops, training and empowering over 90 participants in our hubs across Ghana to make meaningful contributions not only on English Wikipedia but also in their native languages. Throughout the month of August there will be a series of virtual office hours every Friday at 4:00 PM UTC dedicated to training participants to edit and translate articles on Wikipedia.
For more information about the contest and how to participate visit the campaign website Africa Day Campaign: Theme: Acceleration of the African Continental Free Trade Area | Open Foundation West Africa or Founders Day meta page.
“There are ways to recover, say tomato seeds, but language is an oral medium . . . it is gone if direct speakers are dead and nothing has been done to document it.”
Keren Rice, Professor of Linguistics, University of Toronto
Entire worlds pass away with human languages…
Secret terrorist watchlist with 2 million records exposed online
By Ax Sharma, August 16, 2021, 12:55 PM
A secret terrorist watchlist with 1.9 million records, including classified “no-fly” records, was exposed on the internet.
The list was left accessible on an Elasticsearch cluster that had no password on it.
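“No password” here means the cluster's REST API would answer anyone who could reach it on the default port. A minimal sketch in Python of what that exposure looks like (the host is a documentation placeholder, not the actual server):

```python
import json
from urllib import request


def open_search_url(host: str, size: int = 10) -> str:
    # 9200 is Elasticsearch's default REST port; with no authentication
    # configured, _search returns matching documents to any caller.
    return f"http://{host}:9200/_search?size={size}"


# Hypothetical usage against an open cluster (do not run against
# systems you do not own):
# with request.urlopen(open_search_url("203.0.113.7")) as resp:
#     hits = json.loads(resp.read())["hits"]["hits"]
```

The point is simply that nothing stands between a crawler (or a search engine like Censys) and the data.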
Millions of people on no-fly and terror watchlists exposed
In July this year, Security Discovery researcher Bob Diachenko came across a plethora of JSON records in an exposed Elasticsearch cluster that piqued his interest.
The 1.9 million-strong recordset contained sensitive information on people, including their names, citizenship, gender, date of birth, passport details, and no-fly status.
The exposed server was indexed by search engines Censys and ZoomEye, indicating Diachenko may not have been the only person to come across the list:
An excerpt from exposed watchlist records (Bob Diachenko)
The researcher told BleepingComputer that given the nature of the exposed fields (e.g. passport details and “no_fly_indicator”) it appeared to be a no-fly or a similar terrorist watchlist.
Additionally, the researcher noticed some cryptic fields, such as “tag,” “nomination type,” and “selectee indicator,” whose meaning he did not immediately understand.
“That was the only valid guess given the nature of data plus there was a specific field named ‘TSC_ID’,” Diachenko told BleepingComputer, which hinted to him the source of the recordset could be the Terrorist Screening Center (TSC).
FBI’s TSC is used by multiple federal agencies to manage and share consolidated information for counterterrorism purposes.
The agency maintains the classified watchlist called the Terrorist Screening Database, sometimes also referred to as the “no-fly list.”
Such databases are regarded as highly sensitive in nature, considering the vital role they play in aiding national security and law enforcement tasks.
Terrorists or reasonable suspects who pose a national security risk are “nominated” for placement on the secret watchlist at the government’s discretion.
The list is referenced by airlines and multiple agencies such as the Department of State, Department of Defense, Transportation Security Administration (TSA), and Customs and Border Protection (CBP) to check whether a passenger is allowed to fly or admissible to the U.S., or to assess their risk for various other activities.
Server taken offline 3 weeks after DHS notified
The researcher discovered the exposed database on July 19th, interestingly on a server with a Bahrain IP address, not a US one.
That same day, he reported the data leak to the U.S. Department of Homeland Security (DHS).
“I discovered the exposed data on the same day and reported it to the DHS.”
“The exposed server was taken down about three weeks later, on August 9, 2021.”
“It’s not clear why it took so long, and I don’t know for sure whether any unauthorized parties accessed it,” writes Diachenko in his report.
The researcher considers this data leak to be serious, considering watchlists can list people who are suspected of an illicit activity but not necessarily charged with any crime.
“In the wrong hands, this list could be used to oppress, harass, or persecute people on the list and their families.”
“It could cause any number of personal and professional problems for innocent people whose names are included in the list,” says the researcher.
Cases where people landed on the no-fly list for refusing to become an informant aren't unheard of.
Diachenko believes this leak could therefore have negative repercussions for such people and suspects.
“The TSC watchlist is highly controversial. The ACLU, for example, has for many years fought against the use of a secret government no-fly list without due process,” continued the researcher.
Note that it is not confirmed whether the server leaking the list belonged to a U.S. government agency or a third-party entity.
BleepingComputer has reached out to the FBI and we are awaiting their response.
Update 11:02 PM ET: The FBI had no comment on the matter.
Hahaha, Elasticsearch’s business model is to prevent this from happening. Don’t use Elasticsearch, folks.
https://rachelbythebay.com/w/2021/08/17/pop/
Asking nicely for root command execution (and getting it)
There was a SEV review meeting once upon a time, and in it, we had reviewed some incident where something bad happened involving something that ran as root. I don’t remember the finer points of that one any more, but the resident VP who ran the show wondered aloud how much “root exposure” we had in our infra in general. That got my attention.
I decided to try to get an answer to it. While in the meeting, I hit up my little dataset of everything running on any machine in production, then narrowed it down to anything running as root (that is, uid 0), and also with some (TCP) network listening ports. I figured those would be the easiest to “pop”. Think “SELECT … FROM … WHERE uid = 0 AND ports IS NOT NULL” type of thing.
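The triage query described above can be sketched in Python over a hypothetical inventory dataset (the record shape and the process names here are invented for illustration):

```python
# A made-up "everything running in production" dataset.
procs = [
    {"host": "web01", "cmd": "nginx",    "uid": 0,  "tcp_ports": [80, 443]},
    {"host": "web01", "cmd": "agentd",   "uid": 0,  "tcp_ports": [9099]},
    {"host": "db02",  "cmd": "postgres", "uid": 26, "tcp_ports": [5432]},
    {"host": "db02",  "cmd": "cron",     "uid": 0,  "tcp_ports": []},
]

# The "SELECT ... WHERE uid = 0 AND ports IS NOT NULL" filter:
# running as root AND listening on at least one TCP port.
exposed = [p for p in procs if p["uid"] == 0 and p["tcp_ports"]]

for p in exposed:
    print(p["host"], p["cmd"], p["tcp_ports"])
```

Here `cron` survives the cut despite running as root because it listens on nothing, which is exactly why the port condition matters for finding things that are easy to "pop" remotely.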
After the meeting ended without using the whole 90 minutes, the room was still reserved for another 15-20 minutes or so, and a couple of people hung out to look at the list. One noticed that this one service running everywhere had these options for “pre” and “post” commands. This was a service which was normally used for performance measurement stuff and involved running an external command. You’d say “okay, analyze this thing for me”, and it’d crank for a bit and kick back a bunch of data points on whatever you had targeted. That much was fine, but unfortunately you could specify arbitrary commands to run before or after the request.
One of the people there came up with a test and built a command that should run “touch /tmp/(their name)-was-here”. Then they fired it off, and another person looked, and sure enough, the file had appeared in /tmp, owned by root. (I should note that this person didn’t have any magic permissions for that service, lest you think that’s what happened here.)
This was a pretty scary hole: you could ask it to run anything, and those commands would be run as root. Uh, sigh, yeah, that turned into a security SEV right there on the spot. Every damn machine running that software could be owned by anyone who sent a properly-formatted request to the service on the box.
Also, there was no log of this request. The person’s name was the sort of thing that was easily grepped, and it didn’t turn up anywhere. That means it didn’t log the actual command run (since it would have contained it), and it didn’t log the request which arrived over the network (which could have contained their username, but didn’t).
So now this meant we had a root hole basically everywhere, and no way to find out if it was ever exploited. It was a terrible situation. Then the next step was to go back in the source code and find out when the pre/post feature was added, when it shipped to production, and then do the date math, to find out how long things were vulnerable. When it turned out that it had been there for years, well, now we had no idea whether someone else found it, exploited it, and shoved off long ago, all with zero detection.
What else could happen to make it worse? Well, we found out that there’s Actual Business Stuff using this pre/post feature, and so we couldn’t just turn it off pending a fix, at least, not without breaking that “Stuff”. Fun, right?
Fortunately, when this happened, one of the people on the service’s team was available and responsive, and understood exactly how to thread the needle of closing the hole and not breaking the business. They built a list of every pre/post command that had been in actual use (by looking through the source tree for clients of that service, naturally), and then turned it into a hard-coded check list. The program would now refuse any pre/post commands unless it was one of those already known to it.
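The mitigation amounts to a hard-coded allowlist check. A minimal sketch (the command names are invented; the real service's list came from grepping its clients in the source tree):

```python
# Commands known to be in actual business use; everything else is refused.
ALLOWED_COMMANDS = {
    "warmup-cache",
    "flush-stats",
}


def run_pre_post(command: str) -> bool:
    """Return True only if the pre/post command is on the allowlist."""
    parts = command.split()
    name = parts[0] if parts else ""
    # Refuse arbitrary commands outright instead of trying to sanitize
    # them; an allowlist is the only safe shape for "run this as root".
    return name in ALLOWED_COMMANDS
```

A deny-by-default list like this closes the hole without breaking the known legitimate callers, which is the needle the service's team had to thread.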
This change patched the hole and so was cherry-picked onto the last release and was pushed out rather quickly, and that was the end of that round of fun. Unfortunately, it was just the beginning of the general pattern of “oh hey, I bet I can pop that thing”, as a great many more were about to be found.
You’d think this would have touched off a new “golden age” of people being thoughtful about not running things as root unless absolutely necessary, and generally being paranoid about handling subcommands, but that’s not how it panned out in general. Instead, I’m pretty sure it pissed off the wrong people, and the rest is pretty much as you would expect.
Suffice it to say, if you work someplace with enough machines, there’s probably some way for you to get root on all of them if you can hit them with a handful of packets. I’ve seen it happen far too many times at enough companies to expect things to stay secure. I’m not talking about buffer overflows and stuff like that, although those exist too. I mean just straight up asking a service to please run a command for you (as root), and it gladly complies.
Maybe this is our version of the “infinite monkeys” thing: given enough software people, enough computers, and enough time, someone at a company will eventually grant universal remote root access to anyone who knows how to read some source code.
That last line cracks me up!
https://zenhabits.net/feel-scattered/
When Things Feel Scattered
By Leo Babauta
Often when our lives have a bunch of things going on at once, and multiple things to manage in each of those areas … it can feel really messy and scattered.
This kind of feeling of messiness can cause us stress, and make us unhappy with our current situation.
We might feel like we’re doing things wrong. We might feel like we’re trying to keep our heads above water, and struggling with it. We might feel helpless, like there’s not much we can do about it.
If you feel scattered like this … I’m here to say that this is a very common feeling, and you’re not alone. Many of us feel scattered, overwhelmed, like our lives are messy and out of control.
There are some tactical things we can do to feel more under control … and there’s a mindset shift (or practice) we can do to get good at feeling peace in the midst of this kind of chaos.
Let’s talk tactics first, then talk practice / mindset shift.
Tactical Methods
If your life is feeling scattered, there are some tactics that might help. I’m going to share some of them, and invite you to test them out to see which ones help you:
- Make a long list. Sometimes it can help to dump everything you need to do onto one long list. It’s simply a brain dump. Mentally scan all the areas of your life, and try to get everything out of your head and onto paper (or digital document). Don’t sort through them at first, just get everything out. Then take a moment to sort through — grouping them into areas, maybe prioritizing them. If you feel like it, knock off a bunch of the small tasks in 30 minutes to clear things out a little.
- Make a short list. Once you have a long list, it might feel even more overwhelming. No one can tackle such a long list all at once! I’ve found that it helps to make a short list of 3-5 things I want to tackle today, from the longer list. These are important things that would make today a great victory. Come back to the long list later, and focus on the short list for now.
- Create some time to sort out each area. Set aside just 20-30 minutes each day to sort through an area of your life — maybe 30 minutes on Monday to list out your financial tasks and clear some of them out. Maybe 30 minutes on Tuesday to make a plan and create structure around your health and self-care rituals. And so on — you can create time for different projects, for house maintenance, for family or relationship issues, etc. In this way, we start to get things in order, one area at a time.
- Create regular time for each area. Similarly, you can block off some time on the calendar each week for each area of your life that could use some regular maintenance. When will you take care of your finances? Meal planning and prep? For connecting with loved ones? For clearing out your email inbox? Block off the time on a recurring basis.
- Get some support. If you’re struggling with getting organized with everything, you might ask for a friend or colleague to sit down with you and help you sort through things. Or get a coach — I’m available for hire!
- Simplify. If things are getting crazy, sometimes it’s a time to pause and consider what you might do to simplify. Have you overcommitted, been too optimistic? Are there commitments you can let go of to give yourself more breathing room? It’s not always the right thing to do, but sometimes simplifying is a beautiful thing to do.
- Do one thing at a time, fully. If you’re feeling scattered, it’s often the case that you’re jumping from one thing to another in a kind of frenetic pace. It might be a helpful thing to slow down. Breathe. Pick one thing, and give it your entire focus. Pour your entire being into it, with full commitment. Then let it go and focus on the next thing.
- Take some breaks for self-regulation. When we’re feeling overwhelmed and scattered, it’s usually more of an emotional experience than it is a problem with our external circumstances. We’ll talk more about that in the next section on mindset … but a helpful tactic is to give yourself some breaks during the day where you can breathe, have some space, and take care of yourself. Regulate your emotions when you’re feeling anxious, overwhelmed, exhausted. Rest and give yourself time to replenish your energy.
OK, that’s enough of tactics! Don’t try to take them all on, just pick one and give it a shot for a bit, to see how it works.
Mental Shift: Practice with the Feeling of Chaos
We often think the problem is with our outer circumstances — we have too much to do, everything is messy! — or we think the problem is that we're not good at staying on top of everything.
But there’s another approach, rather than changing external circumstances or getting better at doing everything right.
The approach is to learn to find peace with chaos.
It’s an acceptance that our lives will always be a bit chaotic, turbulent, messy. Our lives will never be in order. And so we can accept this chaos as not just a part of life, but the experience of life itself. This chaos is how life feels.
And then we can learn to relax, and find peace. Imagine finding calm while out in a stormy sea. Learning to love the storm itself.
So here’s how I suggest practicing with this:
- Write out a reminder to practice during the day — a note to yourself like, “Feel the chaos.” Then practice noticing when you’re feeling scattered, overwhelmed, messy.
- When you notice the feeling … pause. Take a breath. Bring your awareness to the bodily sensations of the messiness. Stay with these sensations for as long as you’re able, coming back to them if your mind gets caught up in thinking.
- Bring a gentle, open, non-judgmental awareness to the sensations of scatteredness. Can you be curious about these sensations, wanting to know more about them?
- See these sensations as simply how chaos feels for you right now. Can you learn to relax with these sensations? Can you learn to breathe and find gratitude for them?
If you practice with these sensations of messy chaos throughout the day, you can learn to get more and more comfortable with the chaos. You can learn to relax, and flow with how things are.
This doesn’t mean you should never get organized, or simplified. It means that you can find peace in the middle of just about any situation, with practice.
I basically live by a long list, so this is a daily practice for me. But I just added “feel the chaos” to it, that’s a pro-tip right there!
When it comes to protecting the words and pictures on a page, Wikipedia’s army of admins is pretty swift to take action. But the need to protect templates has apparently been a blind spot for moderators. According to one admin in the forum, after protections were placed on a separate oft-used template, a Wikipedia user revoked them not long after on the grounds that the template wasn’t popular enough to merit that particular safeguard. But in light of this latest attack, many admins are changing their tune.
“I didn’t realize templates used on tens of thousands of pages weren’t template-protected as a matter of course,” one of them commented. “Something that can vandalize 53,000 pages at once seems like a big gap in security.”
Oh damn! Template-hacked!
https://theoldnet.com/docs/httpproxy/index.html
TheOldNet Browser Proxy Instructions
What does it do?
Right now with theoldnet.com you can go to any archived web page, but what you can’t do is type, say, nintendo.com into the address bar and have it retrieve nintendo.com from 1996. Of course it wouldn’t; it would return the present-day nintendo.com. By using a browser proxy, you can accomplish this: any address you type into the address bar will return a website from the archive instead of a live, present-day website.
Browser proxies were used back in the day in schools and institutions so that people accessing the internet would get a local, cached copy instead of retrieving the site from the internet directly. TheOldNet’s implementation of a browser proxy makes use of this feature.
To begin
Grab your browser of choice; if you do not have one, a great option is RetroZilla, which you can get here.
These steps are pretty universal for all Mozilla / Netscape browsers
Step 1: In Proxy Settings, add the host theoldnet.com and the port 1996.
Step 2: Add web.archive.org to the “No Proxy for” list.
Adding web.archive.org is required so that binary file downloads, such as images and zip files, will work.
For Internet Explorer follow these steps:
To access The Old Net Search Engine through the proxy simply visit theoldnet.com!
Advanced Settings
The proxy has a default starting year of 1996 and a default starting month of the current month. This means all search results will be the closest match to the current month, in 1996. This may end up being a site from 1998, or 2001, it all depends on what data is available.
If you wish you can choose a different default starting year by changing which port you connect to.
The valid ports/years are: 1996 - 2012
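Since it's a plain HTTP proxy, any proxy-aware client can use the port-as-year scheme, not just a retro browser. A hedged sketch with Python's standard library (hypothetical usage; `oldnet_proxy_for_year` is my name, not part of the service):

```python
import urllib.request


def oldnet_proxy_for_year(year: int) -> urllib.request.ProxyHandler:
    """Build a proxy handler for TheOldNet; the port selects the year."""
    if not 1996 <= year <= 2012:
        raise ValueError("valid ports/years are 1996 through 2012")
    return urllib.request.ProxyHandler(
        {"http": f"http://theoldnet.com:{year}"}
    )


# Hypothetical usage: fetch nintendo.com as archived around 1999.
# opener = urllib.request.build_opener(oldnet_proxy_for_year(1999))
# html = opener.open("http://nintendo.com").read()
```

Note this only makes sense for plain `http://` URLs, matching how sites of that era were served.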
Terms of Use
Generally speaking, you may use the service free of charge for personal use.
If you are a business / commercial entity / social media presence, you should specify when you are using theoldnet.com’s HTTP proxy service in your material.
Such a neat service!
Broad idea:
In order to show that its library did no harm, IA says it wants to compare the commercial performance of books that were available for digital download with books that were not available for digital lending. Thus far, however, the publishers haven’t been prepared to offer data, at least to the extent requested by IA.
My callout:
According to IA’s letter, the publishers insist that producing data about all of their books would be unduly burdensome since there are only 127 books listed in the complaint. However, IA says that it doesn’t necessarily need every book to conduct a comparison and would be satisfied if the publishers provided data on each of those works and data on one or more comparable works that were not available for digital lending at the same time as those works.
What? How does that make sense? It’s only 127 books. How do they not have that information readily available? Do they not do their jobs at those publishers? (I happen to know some do and some do not!)
A story about someone using mosh and SDF to escape an elevator, thanks to its handling of high packet loss: https://mosh.org/elevator.txt
At https://mosh.org/ it says:
Mosh
(mobile shell)
Remote terminal application that allows roaming, supports intermittent connectivity, and provides intelligent local echo and line editing of user keystrokes.
Mosh is a replacement for interactive SSH terminals. It’s more robust and responsive, especially over Wi-Fi, cellular, and long-distance links.
Mosh is free software, available for GNU/Linux, BSD, macOS, Solaris, Android, Chrome, and iOS.
And I found this bit that’s quite interesting:
Same login method.
Mosh doesn’t listen on network ports or authenticate users. The mosh client logs in to the server via SSH, and users present the same credentials (e.g., password, public key) as before. Then Mosh runs the mosh-server remotely and connects to it over UDP.
Neato! Good to know about, in case it’s ever required.
https://jlongster.com/future-sql-web
Wow.
Introducing absurd-sql
SQL is a great way to build apps. Especially small local web apps. Key/value stores may have their place in large distributed systems, but wow wouldn’t it be great if we could use SQLite on the web?
I’m excited to announce absurd-sql, which makes this possible. absurd-sql is a filesystem backend for sql.js that allows SQLite to read/write from IndexedDB in small blocks, just like it would a disk. I ported my app to use it, and you can try it here.
This whole situation, and how well this project ended up working out, really is absurd. Why? In all browsers except Chrome, IndexedDB is implemented using SQLite. Anyway…
A huge thanks to phiresky for writing the article Hosting SQLite databases on GitHub Pages, which inspired me to do this. It’s a very clever technique and it gave me the idea to research this.
sql.js is already a great project for using SQLite on the web. It compiles SQLite to WebAssembly, and lets you read databases and run queries. The major problem is that you can’t persist any writes. It loads the entire db into memory, and only changes the in-memory data. Once you refresh the page, all your changes are lost.
While in-memory databases have their uses, it kneecaps SQLite into something far less useful. To build any kind of app with it, we need the ability to write and persist.
absurd-sql solves this, and it works by intercepting read/write requests from SQLite and fetching and persisting them into IndexedDB (or any other persistent backend). I wrote a whole filesystem layer that is aware of how SQLite reads and writes blocks, and it efficiently performs the operations correctly.
What this means is it never loads the database into memory because it only loads whatever SQLite asks for, and writes always persist.
The trade-offs are very interesting, given the use case.
What’s the catch?
Surely there’s a catch somewhere. Where are we paying the cost of those wins?
There is one downside: you need to download a 1MB WebAssembly file (actually I forgot about gzipping! it’s only 409KB after that). But that’s it. That’s the only catch. Everything else is an upside, and with WASM streaming compilation, the cost of parsing/compiling is relatively low. For real apps it’s a no-brainer, especially since it’s so cacheable (it never changes).
Not only do you get fantastic performance, you also get all the features of a great database:
- Transactions! (that don’t suck and try to auto-commit)
- A whole query system!
- Views!
- Common Table Expressions!
- Triggers!
- Full-text search!
- Caching (more major speedups)!
- So much more!
Here’s a demo of full text search. Click “load data” to load some data in, and then start typing in the input. If you refresh the page the data will still be there.
No really — how is this possible?
Whenever something looks too good to be true, I like to sit back and make sure I’m not missing something. Is there some catch I’m missing?
How can we beat IndexedDB so easily anyway? If this works, why don’t we do the same thing with native SQLite and chop up a SQLite db into blocks?
The reason it works is because IndexedDB is so slow. The simple act of batching read/writes provides such drastic performance benefits, that no amount of CPU processing is going to close the gap. Because that’s the thing — we are doing a lot more CPU work! Shouldn’t that count for something negative?
No: because we save so much time avoiding IndexedDB reads/writes, the CPU hit is negligible. You don’t even see it unless you look really closely. Saving that time gives us plenty of room to do everything SQLite does and still be 5x (or much more) faster.
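The batching point is the same one that shows up with plain SQLite on disk: many writes inside one transaction sync to storage once, while committing per write syncs every time. An illustrative sketch with Python's built-in sqlite3 (this is not absurd-sql's code; the table is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE kv (k INTEGER PRIMARY KEY, v TEXT)")

# One transaction covering 1000 inserts: the batch is committed (and,
# on a real disk or IndexedDB-style backend, synced) exactly once,
# instead of paying the per-write round trip 1000 times.
with conn:
    conn.executemany(
        "INSERT INTO kv VALUES (?, ?)",
        ((i, "x") for i in range(1000)),
    )

count = conn.execute("SELECT COUNT(*) FROM kv").fetchone()[0]
```

On an in-memory database the difference is invisible, but against a slow backend the per-write round trips are exactly the cost absurd-sql's block batching avoids.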
I’m really hoping for a better storage API like this one. Let’s hope it happens.
A Self-Sufficient Mind
By Leo Babauta
In a quiet room, we can find stillness. And in that stillness, we can contemplate our own mind.
What we often find is that the mind is very restless. It wants to take care of a thousand things, because it’s feeling some uncertainty and fear. It wants to fix problems, take care of all the undone things, figure out if everything is going to be OK. It wants to get all of our needs met, from survival needs to meaning, connection and love.
The mind is restless, wanting to fix everything, get everything it needs.
What if we could allow our minds to rest, settling into the full sufficiency of itself just as it is?
We would need nothing in each moment, other than what’s required for physical survival. That doesn’t mean we do nothing (though we could!) — beyond our needs, there might be a wholehearted desire to do some good for ourselves or others, but it doesn’t have to come from fear.
There’s a settledness, a peace, that can come with this kind of practice.
There’s a feeling that we are enough. That everything we need is already contained in us.
It’s a lifetime practice.
Here’s how I recommend starting:
- Sit in a quiet spot. Elevate your hips above your knees with a cushion, to give yourself more stability and comfort. Sit in an upright but relaxed posture. Eyes can be closed or slightly open with a soft downward gaze.
- Find stillness. Stay in this spot for at least 5-10 minutes, longer over time if you like. It doesn’t have to be long, but when you feel restless, stay for a little longer to practice with this restlessness.
- Rest in direct experience. Let your attention turn to the sensations of your body, the sensations of the present moment. These sensations are direct experience of the world. Rest your mind in this open awareness of direct experience, without needing to do anything but witness them.
- Observe the mind. Your mind will want to turn away from this direct experience. That’s because it feels unsettled. It wants to get its needs met, or fix problems or deal with uncertainties or fear. That’s OK! Watch the mind do its thing. What is it trying to fix? Notice the underlying fear or desire as the mind tries to do its thing.
- Appreciate the luminous quality of the mind. The mind is like an energy, trying to do its best to survive. It is unaware that it already is brilliant, abundant, enough. It is luminous and beautiful. We can start to appreciate these delightful qualities of the mind. This takes curiosity, appreciation, and lots of practice. Keep practicing.
Go and sit, practice, and let me know what you find!
This is a practice I use, though I use more movement, due to my sensitivity; there is rarely a place I can be that isn’t disruptive to my senses, so I do walking meditations.
Although, occasionally I find myself seated on a log or outcropping rock in the middle of a redwood forest, and there is stillness to be found, even in such an animated environment (the quieter we are, the more active the forest becomes…).
Slow Holidays
By Leo Babauta
For the past couple of Decembers, I’ve created a lot of spaciousness and slowness for myself. It’s a beautiful way to wind down the year and reflect on my life.
The holiday season can be a rushed affair for many people, but it doesn’t have to be. I’d like to share a few ideas in slowing things down for the holidays.
If you’d like to create a slow holiday season for yourself, it’s doable, with intention.
Here are some ideas:
- Create space. Our days tend to be filled automatically, whether it’s with work tasks, emails and messages, social media, calls and meetings, or just random Internet stuff. If we want space, we have to create it intentionally — block off a full day or weekend for time away from devices, have a work stop time in the late afternoon or early evening, create blocks of time for rest or walks, have intentional tea time, meals, meditation, reflection, journaling. Whatever calls you, create the space for it.
- Celebrate slowness. Whether you create space or are going about your work or personal life … what would it be like to do it more slowly, instead of rushing? Could you celebrate slow meals, slow days of reading, slow mornings or evenings, slow cooking or cleaning? Think of it as a leisurely way of being deliberate about your activities. You can have a slow hour or two with loved ones, just taking time to be together without technology.
- Simplify celebrations. This is a great time to reduce the amount of holiday celebrations you take on. If you would normally do office Christmas parties and multiple celebrations with friends and families … this is a good year to let most of that go. Consider keeping it as simple as possible, so there doesn’t have to be a lot of preparation, travel, stress.
- Simplify gift giving. What if you didn’t have to buy a ton of gifts? You’d reduce stress for everyone involved, reduce the amount you spend, and reduce the impact on the environment. Consider having a conversation with family and friends, to do an exchange that would reduce the number of gifts you give … you might also consider giving experience gifts, making them some food or other consumables, something that wouldn’t cost a lot nor add to the pile of things in their closets. Let the holidays be about spending time together, not consumerism.
- Savor spaciousness. Whenever you get a little bit of space, really allow yourself to savor it. Can you find the deliciousness in the outdoors, in a quiet morning of reading, in a 20-minute meditation session, in taking a luxurious nap? Savoring can be a theme of the season, instead of getting through things.
- Create simple rituals. What small rituals will help you slow down, be intentional, and savor? Perhaps a morning reading or journaling ritual, some meditation or yoga, or a daily walk? Slow silent meals without a device, or an evening reflection? These can be daily rituals that help you keep your intentions.
- Reflect in quietude. You might spend some quiet time each day, or each week, reflecting on your life. Reflect on how this year has gone, on your victories and lessons. Reflect on what has been coming up for you lately, and what you might learn from all of it. Reflect on what you want in life, and how you might take responsibility for creating it. Reflect on what you love most, what is most important to you, what you’re grateful for.
You don’t have to do all of these — packing a long todo list is perhaps not quite aligned with Slow Holidays! But I hope these give you a few ideas to consider.
I wish you delicious slowness and joyful quiet this holiday season.
https://zenhabits.net/slow-holidays/
I practice something more, hmmm, severe: I acknowledge holidays for their historical-cultural importance, but largely disengage from people in their personal practices. Or: I don’t celebrate holidays, I celebrate people, and the holidays get in the way of that, as they are too busy to know…
Sharing this so you know there is an alternative to your holiday rush, if that is what you experience.
Aw, crap.
New features in Neovim 0.5
August 3, 2021
This article was contributed by Ayooluwa Isaiah
Neovim 0.5, the fifth major version of the Neovim editor, which descends from the venerable vi editor by way of Vim, was released on July 2. This release is the culmination of almost two years of work, and it comes with some major features that aim to modernize the editing experience significantly. Highlights include native support for the Language Server Protocol (LSP), which enables advanced editing features for a wide variety of languages, improvements to its Lua APIs for configuration and plugins, and better syntax highlighting using Tree-sitter. Overall, the 0.5 release is a solid upgrade for the editor; the improvements should please the existing fan base and potentially draw in new users and contributors to the project.
The Neovim project was started by Thiago Padilha in 2014 shortly after his patch to introduce multi-threading capabilities to Vim was rejected without much in the way of feedback. This event was the major trigger that led Padilha to create this fork, with the explicit aim of improving the usability, maintainability, and extensibility of Vim while facilitating a more open and welcoming environment.
A built-in LSP client
The Language Server Protocol is an open-source specification that standardizes programming language features across different source code editors and integrated development environments (IDEs). It facilitates communication between code-editing tools (clients) and locally running language servers to provide language-specific smarts such as auto-completion, find references, go-to-definition, diagnostics, and refactoring assistance.
Prior to the development of LSP, the work of providing support for a programming language had to be implemented for each IDE or text editor, either directly in the code, or through its extension system, which led to varying levels of support across language and editor combinations. The LSP standard enables the decoupling of language services from the editor into a self-contained piece so that language communities can concentrate on building a single server that has a deep understanding of a language. Other tools can then provide advanced capabilities for any programming language simply by integrating with the existing language servers.
While it was already possible to use LSP in Neovim with the help of third-party plugins, the 0.5 release adds native LSP support to Neovim for the first time. The introduction of LSP in Neovim allows the editor to act as a client, informing a language server about user actions (such as executing a “go-to-definition” command); the server answers the request with the appropriate information, which could be the location of the definition for the symbol under the cursor. That will allow the editor to navigate to the specified location in the file or project.
The interface provided by the Neovim LSP client is a general one, so it does not support all of the features that are available in third-party LSP plugins (e.g. auto-completion). It was built to be extensible, though, so it includes a Lua framework that allows plugins to add features not currently supported in the Neovim core. Setting up individual language servers for the editor can be done using the nvim-lspconfig plugin, which helps with the launching and initialization of language servers that are currently installed on the system. Note that language servers are not provided by Neovim or nvim-lspconfig; they must be installed separately. There is a long list of LSP servers supported by the nvim-lspconfig plugin.
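As a rough illustration of the setup described above, a server can be wired up through nvim-lspconfig with a few lines of Lua. This is a minimal sketch, assuming the plugin is installed and that the pyright language server binary is already available on the system (neither is shipped with Neovim); the `gd` mapping shown is one common choice, not a default.

```lua
-- Minimal nvim-lspconfig sketch (assumes nvim-lspconfig and the
-- `pyright` server binary are installed separately).
local lspconfig = require('lspconfig')

lspconfig.pyright.setup {
  -- Called once the server attaches to a buffer.
  on_attach = function(client, bufnr)
    -- Buffer-local mapping: "gd" jumps to the definition under the cursor.
    vim.api.nvim_buf_set_keymap(bufnr, 'n', 'gd',
      '<cmd>lua vim.lsp.buf.definition()<CR>',
      { noremap = true, silent = true })
  end,
}
```

With this in place, opening a Python file starts pyright, and `vim.lsp.buf.definition()` asks the server for the location the editor then jumps to — the request/response flow outlined earlier.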
Lua integration
Initial support for the Lua programming language in Neovim landed in the 0.2.1 release in 2017. It has seen continued development and deeper integration in the editor since then, most notably with the addition of a Neovim standard library for Lua in the 0.4 release in 2019. The Neovim developers expect Lua to become a first-class scripting language in the editor, thus providing an alternative to VimL, which is the scripting language inherited from Vim. Neovim 0.5 takes big strides toward the realization of this goal by improving the Lua API and adding init.lua as an alternative to init.vim for configuring the editor.
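To make the init.lua alternative concrete, here is a small configuration sketch using APIs available in 0.5; the specific options and mapping are illustrative choices, not recommendations.

```lua
-- ~/.config/nvim/init.lua — a minimal configuration sketch.
-- `vim.opt` wraps Vim options; `vim.api.nvim_set_keymap` defines mappings.
vim.opt.number = true        -- show line numbers
vim.opt.expandtab = true     -- insert spaces instead of tabs
vim.opt.shiftwidth = 2       -- indent by two spaces

-- Equivalent of :nnoremap <silent> <leader>w :write<CR>
vim.api.nvim_set_keymap('n', '<leader>w', ':write<CR>',
  { noremap = true, silent = true })
```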
A good explanation of the rationale behind the decision to embed Lua in Neovim can be found in a video of a talk by Justin M. Keyes, a lead maintainer for the project. In summary, Lua is a more approachable language than VimL due to its simplicity and ease of embedding. It is also an order of magnitude faster than VimL. Neovim supports Lua 5.1, which was released in 2006, rather than more recent versions of Lua, such as 5.3 or 5.4 (released 2015 and 2020 respectively), mostly due to LuaJIT, which only supports Lua 5.1. The motivation for maintaining compatibility with LuaJIT stems from its significant performance advantages over the standard Lua compiler.
Adding Lua to Neovim has made it easier to extend the capabilities of the editor and contribute to its core code, especially for users who have been put off by VimL, which is not a language that is used outside of Vim. Since Lua is also heavily used for scripting video games and for extending other programs written in a variety of languages (C, C++, Java, etc.), there is an abundance of resources available for learning the language, along with examples that show how to use it to interact with APIs from other languages. This wealth of information on Lua makes it possible for new plugin authors and aspiring Neovim contributors to get up to speed with the language quickly.
The Lua support in Neovim has led to it becoming the preferred language for how newer Neovim features, such as the LSP client, are being exposed. Using these APIs can only be done with Lua, since VimL cannot be used to interact with them. However, VimL support in Neovim is not going anywhere, and the Neovim developers do not anticipate any reason to deprecate it, so migrating an existing init.vim configuration to init.lua, or porting a VimL plugin to Lua for the sake of it is completely optional at this time. The only caveat is that using these Neovim APIs (such as LSP or Tree-sitter) in an init.vim configuration or VimL plugins can only be done by embedding some Lua snippets within the existing VimL code.
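The embedding caveat mentioned above looks like this in practice: an existing init.vim can reach the Lua-only APIs either through a `<cmd>lua ...<CR>` mapping or a `lua << EOF` heredoc block. The snippet is a sketch; the specific mapping and log level are arbitrary examples.

```vim
" Inside an existing init.vim, Lua-only APIs can be reached from VimL.
nnoremap <silent> K <cmd>lua vim.lsp.buf.hover()<CR>

lua << EOF
-- Everything between the markers is evaluated as Lua.
vim.lsp.set_log_level("warn")
EOF
```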
Although deeper Lua integration is seen as one of the main achievements of the 0.5 release, not all of the reactions toward the push to supplant VimL in the editor core have been positive. There is some concern that the emphasis on Lua APIs, and Lua-only plugins, will lead to a split in the plugin community where an increasing number of plugins will be Neovim-only (as opposed to supporting both Vim and Neovim). Also, an improved and not entirely backward-compatible version of VimL (currently referred to as Vim9) is under active development by Vim creator Bram Moolenaar and other Vim contributors. It is not entirely clear whether the Neovim maintainers plan to support Vim9, since they are more invested in Lua. At the time of this writing, there are already several Lua plugins that work only in Neovim, and a handful of Vim9 plugins that work only in Vim. It is therefore easy to speculate that the ecosystems for both projects may diverge significantly in the near future as there are currently no plans to bring a similar level of Lua integration into Vim.
Tree-sitter
Tree-sitter is a new parsing system that aims to replace the limited, regular-expression-based, code-analysis capabilities that are prevalent in current developer tools. It is a high-performance parser generator that can build parsers to create an incremental syntax tree for a source file, and can efficiently update the syntax tree in real time as the file is being edited. In Neovim 0.5, support for Tree-sitter has been added to the editor core, although it is currently classed as experimental due to some known bugs along with performance issues for large files. The expectation is that it will become stable in the next major release (0.6), which should be expected in a year or two judging from past releases.
Using Tree-sitter in Neovim makes it possible for the editor to understand the code in a source file as a tree of programming language constructs (such as variables, functions, types, keywords, etc.), and use that information to handle those constructs consistently. When a Tree-sitter parser is installed and enabled for a specific language, the editor’s syntax highlighting will be based on the syntax trees it provides; this results in improvements to the use of color to outline the structure of the code more clearly. In particular, object fields, function names, keywords, types, and variables will be highlighted more consistently throughout the file.
Tree-sitter is also able to do incremental parsing, which keeps the syntax tree up to date as the code is being edited. This puts an end to the practice of re-parsing an entire file from scratch in order to update its syntax highlighting after a change is made, which is currently the case with regular-expression-based highlighting systems. That leads to significant speed improvements.
Tree-sitter has been lauded for its improved syntax-highlighting capabilities, but it also enables the definition of language-aware text objects better suited to editing code than what is provided by default in the editor. The nvim-treesitter-textobjects module allows the creation of text objects for constructs like classes, functions, parameters, conditionals, and more, which can be manipulated just as easily as words or sentences. Several examples of the Tree-sitter-based highlighting can be seen in the gallery for the nvim-treesitter repository.
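Enabling the Tree-sitter-based highlighting described above is typically done through the nvim-treesitter plugin. The following is a minimal sketch, assuming the plugin is installed; the parser names are examples.

```lua
-- nvim-treesitter sketch (plugin assumed installed; parsers are examples).
require('nvim-treesitter.configs').setup {
  ensure_installed = { 'lua', 'python' },  -- parsers to fetch and build
  highlight = {
    enable = true,  -- use Tree-sitter instead of regex-based highlighting
  },
}
```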
Wrapping up
The features above make up the bulk of this release, but Neovim 0.5 also includes improvements and bug fixes to the user interface, as well as smaller features such as support for remote plugins written in Perl 5.22+ on Unix platforms. It is also worth mentioning that around 1000 Vim patches were merged in this release, updating various aspects of the editor. The full list of changes, fixes and refinements can be seen in the release notes linked above.
The Neovim project uses GitHub issues to track all feature and bug requests, so a list of closed issues for the 0.5 milestone is available for a further exploration of the changes that made it into this release. The planning for subsequent releases is detailed on the project’s roadmap page, while priorities are tracked through GitHub milestones. Contributions from the community are welcome, of course, and the project maintainers may be reached via Gitter, Matrix, or the neovim room on irc.libera.chat.
Neovim 0.5 came out a while ago, but I’m still excited about the LSP and Tree-sitter integration. I’m working on neo(vi)m, one dream at a time, and I haven’t even begun to access the development options available in Neovim.
Lots of links, but also talk about feelings, for those looking for such. My favorite bit: the image that shows, “COURAGE”.