
      JMP: Newsletter: JMP is 7 years old — thanks to our awesome community!

      news.movim.eu / PlanetJabber · 5 days ago - 02:51 · 2 minutes

    Hi everyone!

    Welcome to the latest edition of your pseudo-monthly JMP update!

    In case it’s been a while since you checked out JMP, here’s a refresher: JMP lets you send and receive text and picture messages (and calls) through a real phone number right from your computer, tablet, phone, or anything else that has a Jabber client. Among other things, JMP has these features: Your phone number on every device; Multiple phone numbers, one app; Free as in Freedom; Share one number with multiple people.

    Today JMP is 7 years old! We launched on this day in 2017 and a lot has changed since then. In addition to what we talked about in past years (see https://blog.jmp.chat/b/february-newsletter-2022 and https://blog.jmp.chat/b/february-newsletter-2023 for example), in the last year we’ve brought JMP out of beta, launched a data plan, and have continued to grow our huge community of people (channel participants, JMP customers, and many more) excited about communication freedom. So, in light of some vibes from yesterday’s “celebration” in some countries, we’d like to take this opportunity to say: Thank you to everyone involved in JMP, however that may be! You are part of something big and getting bigger! Communication freedom knows no bounds, technically, socially, or geographically. And you make that happen!

    Along with this huge community growing, we’ve been growing JMP’s staff as well — we’re now up to 5 employees working hard to build and maintain the foundations of communication freedom every day. We look forward to continuing this growth, in a strong and sustainable way, for years to come.

    Lastly, while dates have not been announced yet, we’re excited to say we’ll be back at FOSSY in Portland, Oregon, this year! FOSSY is expected to happen in July and, if last year is any indication, it will be a blast. We’d love to see some of you there!

    Thanks again to everyone for helping us get to where we are today. We’re super grateful for all your support!

    As always, we’re very open to feedback and would love to hear from you if you have any comments, questions, or otherwise. Feel free to reply (if you got this by email), comment, or find us on any of the following:

    Thanks for reading and have a wonderful rest of your week!


      This post is public

      blog.jmp.chat /b/february-newsletter-2024


      Erlang Solutions: Why Elixir is the Programming Language You Should Learn in 2024

      news.movim.eu / PlanetJabber · 7 days ago - 15:25 · 5 minutes

    In this article, we’ll explain why learning Elixir is an ideal way to advance your growth as a developer in 2024. What factors should you consider when deciding to learn a new programming language?

    Well, it typically depends on your project or career goals. Ideally, you’d want a language that:

    • Is enjoyable and straightforward to use
    • Can meet the needs of modern users
    • Can offer promising career prospects
    • Has an active and supportive community
    • Provides a range of useful tools
    • Supports full-stack development through frameworks
    • Offers easily accessible documentation
    • Helps you grow as a programmer

    This article will explore how Elixir stacks up against these criteria.

    Elixir is fun and easy to use

    Elixir is fun and very user-friendly, which is an important long-term consideration. Its syntax bears a striking resemblance to Ruby. It’s clean and intuitive, making coding simple.

    When it comes to concepts like pattern matching and immutable data, they become your trusted allies and simplify your work. You’re also surrounded by a supportive and vibrant community, so you’re never alone in your journey. Whether you’re building web apps, handling real-time tasks, or just experimenting, Elixir makes programming enjoyable and straightforward, without any unnecessary complexity.
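    For a taste of what that looks like, here is a minimal, hypothetical sketch of pattern matching and immutable data in Elixir (the `Shapes` module and values are illustrative, not from any real codebase):

```elixir
# Pattern matching lets you destructure data right in the function head.
defmodule Shapes do
  def area({:rectangle, w, h}), do: w * h
  def area({:square, side}), do: side * side
end

# Data is immutable: "updating" a map returns a new map,
# leaving the original untouched.
user = %{name: "Ada", visits: 1}
updated = %{user | visits: user.visits + 1}

IO.inspect(Shapes.area({:rectangle, 3, 4}))  # => 12
IO.inspect(updated.visits)                   # => 2
IO.inspect(user.visits)                      # => 1 (original unchanged)
```

    Each function clause matches one shape of data, so there is no need for an if/else ladder, and the immutable map means no other part of the program can see a half-finished update.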

    How Elixir can meet modern usage demands

    Elixir’s strength in handling massive spikes in user traffic is unparalleled, thanks to its foundation on the BEAM VM, designed explicitly for concurrency.

    BEAM Scheduler

    While digital transformation brings about increased pressure on systems to accommodate billions of concurrent users, Elixir stands out as a reliable solution. For those curious about concurrency and its workings, our blog compares the JVM and BEAM VM, offering insightful explanations.
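    As a rough illustration of how cheap BEAM processes are, the sketch below (using nothing beyond the standard library) spawns 100,000 of them and collects one message from each:

```elixir
parent = self()

# Spawn 100_000 lightweight BEAM processes; each sends one message back.
for i <- 1..100_000 do
  spawn(fn -> send(parent, {:result, i * 2}) end)
end

# Collect all 100_000 replies via message passing.
total =
  Enum.reduce(1..100_000, 0, fn _, acc ->
    receive do
      {:result, n} -> acc + n
    end
  end)

IO.puts(total)  # => 10000100000
```

    On typical hardware this completes in well under a second; creating that many OS threads would be prohibitively expensive, which is the gap the BEAM scheduler is designed to close.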

    Major players like Pinterest and Bleacher Report have recognised the scalability benefits of Elixir, with Bleacher Report, for instance, reducing its server count from 150 to just 5.

    This not only streamlines infrastructure but also enhances performance, allowing them to manage higher traffic volumes with faster response times. The appeal of a language that delivers scalability and fault tolerance is great for navigating the demands of today’s digital landscape.

    Elixir’s rewarding career progression

    Embarking on a career in Elixir programming promises an exciting journey filled with learning and progress. As the demand for its developers rises, opportunities for growth blossom across various industries. Mastering Elixir’s unique mix of functional programming and concurrency equips developers with sought-after skills applicable to a wide range of projects, from building websites to crafting complex systems. Plus, more and more companies, big and small, are embracing the language. As developers dive deeper into Elixir and gain hands-on experience, they pave the way for a rewarding career path filled with growth and success.

    When Elixir first emerged, its community was small, as expected with any new technology. But now, it’s thriving! Exciting events like ElixirConf in Europe and the US, EMPEX, Code Elixir LDN, Gig City Elixir, and Meetups worldwide contribute to this vibrant community.

    This growth means the language is always evolving with new tools, and there’s always someone ready to offer inspiration or a helping hand when tackling a problem.

    Elixir’s range of useful tooling

    Tooling makes languages more versatile and tasks easier, saving you from reinventing the wheel each time you tackle a new problem. Elixir comes equipped with a range of robust tools:

    • Phoenix LiveView: Enables developers to build real-time, front-end web applications without JavaScript.
    • Crawly: Simplifies web crawling and data scraping tasks in Elixir.
    • Ecto: A database wrapper and query generator for Elixir, designed for building composable queries and interacting with databases.
    • ExUnit: Elixir’s built-in testing framework, providing a clean syntax for writing tests and running them in parallel for efficient testing.
    • Mix: Elixir’s build tool, which automates tasks such as compiling code, managing dependencies, and running tests.
    • Dialyzer: A static analysis tool for identifying type discrepancies and errors in Erlang and Elixir code, helping to catch bugs early in the development process.
    • ExDoc: A documentation generator for Elixir projects, which generates HTML documentation from code comments and annotations, making it easy to create and maintain project documentation.
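    To give a feel for one of these, here is a self-contained ExUnit sketch (the test name and module are hypothetical) that starts the framework, defines a test, and runs it, much as `mix test` would:

```elixir
# Start ExUnit without its automatic run-on-exit behaviour.
ExUnit.start(autorun: false)

defmodule StringTest do
  use ExUnit.Case

  test "upcasing a string" do
    assert String.upcase("elixir") == "ELIXIR"
  end
end

# ExUnit.run/0 returns a summary map, e.g. %{total: 1, failures: 0, ...}
result = ExUnit.run()
IO.inspect(result.failures)  # => 0
```

    In a real project, Mix generates this scaffolding for you and `mix test` discovers and runs every test file in parallel.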

    Elixir frameworks allow for full-stack development

    Given its scalability and performance, and its origins in Erlang, it is no surprise that Elixir is a popular backend choice. As mentioned above, Phoenix LiveView has provided an easy, time-efficient and elegant way for Elixir developers to produce front-end applications.
    Also, the Nerves framework allows for embedded software development on the hardware end. As a result, this is a language that can be adopted throughout the tech stack. This doesn’t just make it an attractive choice for businesses; it also opens up the door for where the language can take you as a developer.

    Elixir’s easily accessible documentation

    In a community that values good documentation, sharing what you know is easy. Elixir is all about that – the Elixir team takes its docs seriously, which makes learning the language easy. And it isn’t just about learning – everyone can jump in and help make those docs even better. It’s like a big conversation where everyone’s invited to share and improve together.
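    That documentation culture is baked into the language itself: docs live next to the code as `@moduledoc` and `@doc` attributes, and ExDoc turns them into browsable HTML. A minimal, hypothetical sketch:

```elixir
defmodule MyMath do
  @moduledoc "A tiny illustrative module; ExDoc renders these attributes as HTML docs."

  @doc """
  Adds two numbers.

      iex> MyMath.add(2, 3)
      5
  """
  def add(a, b), do: a + b
end

IO.inspect(MyMath.add(2, 3))  # => 5
```

    The `iex>` example in the docstring can even be executed as a doctest, so the documentation is checked against the code it describes.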

    Learning Elixir can make you a better programmer in other languages

    Many developers transitioning from object-oriented languages have shared their experiences of how learning Elixir has enhanced their programming skills in their main languages. When you dive into a new purely functional programming style like Elixir, it makes you rethink how you code. It’s like shining a light on your programming habits and opening your mind to fresh ways of solving problems. This newfound perspective sticks with you, no matter what language you’re coding in next. And if you’re a Ruby fan, Elixir’s syntax feels like home, making the switch to functional, concurrent programming super smooth.

    While everyone has their reasons for picking a programming language, these are some pretty solid reasons to give Elixir a try in 2024 and beyond.

    Ready to get started in Elixir?

    Getting started is simple.

    Begin by visiting the official “Getting Started” page. Additionally, you’ll find a host of free downloadable packages from our team at Erlang Solutions, available for Elixir.

    To immerse yourself in the community, ElixirForum is an excellent starting point. You can also explore discussions using the #Elixirlang and #MyElixirStatus hashtags on Twitter.

    Curious to learn more about what we do with the Elixir language? Keep exploring!

    The post Why Elixir is the Programming Language You Should Learn in 2024 appeared first on Erlang Solutions.


      www.erlang-solutions.com /blog/why-elixir-is-the-programming-language-you-should-learn-in-2024/


      Erlang Solutions: A Match Made in Heaven – Transactional Systems and Erlang/Elixir

      news.movim.eu / PlanetJabber · Thursday, 8 February - 06:59 · 4 minutes


    Real-time responsiveness: Maintaining the pulse of live events

    Elixir, rooted in the robust framework of Erlang, was originally designed for developing telephony applications which demand swift responses within milliseconds, consistently and reliably. This characteristic perfectly aligns with the demands of transactional systems like a sports betting platform where instantaneous updates during pivotal moments are crucial. Elixir’s natural ability to manage these constant updates while maintaining minimal latency becomes a critical asset in such an environment. The Grand National, an event notorious for overwhelming bookmakers every year, is a perfect example of how Elixir’s real-time responsiveness can shine due to the avalanche of transactions occurring simultaneously in a very small window.

    Handling such monumental traffic volumes presents a significant challenge in transactional systems. High-profile events produce an overwhelming amount of transactions which necessitates a system capable of managing these surges without falling to its knees. Enter Elixir/Erlang, both of which are well known for their proficiency in handling surges without flinching. Discord, the instant messaging giant, exemplifies Elixir’s ability to handle escalating demands as they were successful in scaling to accommodate 5 million concurrent users with millions of events per second, vividly showcasing Elixir’s prowess.

    Concurrency and fault tolerance: Preventing disruptions in user experience

    Elixir/Erlang was built from a foundation designed for concurrency, simplifying the creation of concurrent systems with a fundamental emphasis on isolation, and fault tolerance. Its architecture revolves around processes that operate independently and communicate solely via message passing without sharing state. This feature ensures that processes will not interfere with one another, enabling individual processes to be monitored and revived in the event of failure.

    In the context of transactional systems, having a single process to manage each user interaction means any issues with one process remain contained and do not affect the rest, so the system keeps running smoothly. This approach prevents the unfortunate situation where a solitary user’s problem could otherwise impact the entire platform, thereby preserving user trust and system integrity amid surges in usage.

    The phrase “let it crash” is very common among Elixir/Erlang developers. It’s not a disregard for the significance of crashes or errors; rather, it signifies the resilience of the system. Crashes or errors are confined to their respective processes, averting a domino effect that could bring down the entire system. Recovery from these isolated incidents is often as straightforward as restarting the affected processes, and there are several strategies for managing how process failures are handled, so developers have some flexibility in that realm.
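    A minimal sketch of that recovery model, using only the standard library (the `Counter` name is illustrative): a supervised process is killed, and its supervisor transparently restarts it with fresh state:

```elixir
defmodule Counter do
  use Agent

  def start_link(_opts), do: Agent.start_link(fn -> 0 end, name: __MODULE__)
  def value, do: Agent.get(__MODULE__, & &1)
end

# :one_for_one means only the crashed child is restarted.
{:ok, _sup} = Supervisor.start_link([Counter], strategy: :one_for_one)

pid_before = Process.whereis(Counter)
Process.exit(pid_before, :kill)  # simulate a crash
Process.sleep(100)               # give the supervisor a moment to restart it

pid_after = Process.whereis(Counter)
restarted? = pid_after != nil and pid_after != pid_before
IO.inspect(restarted?)       # => true
IO.inspect(Counter.value())  # => 0 (fresh state after the restart)
```

    The rest of the system never notices the crash; the supervisor's restart strategy (`:one_for_one` here) is one of the several failure-handling strategies mentioned above.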

    Scalability: Seamlessly adapting to growing demands

    The concurrency model implemented by Erlang/Elixir allows for seamless scaling to accommodate escalating demands without compromising service quality. As resource requirements increase within a node, the system spawns more processes, ensuring a consistent and reliable service even during rapidly increasing user loads. This scalability has been validated by industry giant Bet365, which was able to seamlessly increase the users supported on a single node from tens to hundreds of thousands.

    Bleacher Report, the second largest sports website in the world, is another success story: it was able to handle 8 times its normal traffic without autoscaling, all the while using 8 servers in comparison to the 150 it was using before adopting Elixir.

    While concurrency isn’t unique to Elixir and Erlang, their strength lies in leveraging the power of the BEAM virtual machine – a time-tested, battle-hardened system designed explicitly for concurrent, fault-tolerant, and real-time applications. As surely as the sun rises, the synergy between Elixir/Erlang and the BEAM virtual machine embodies a level of reliability and resilience that will not falter.


    In conclusion, companies that require highly transactional systems call for a platform that is capable of managing extreme data loads, ensuring constant responsiveness, and remaining resilient in the face of unpredictability. Erlang and Elixir, through their unique strengths and support from the BEAM, stand out as the ideal solution, not just for sports betting but for any industry facing similar demands for reliability, scalability, and real-time processing.

    The post A Match Made in Heaven – Transactional Systems and Erlang/Elixir appeared first on Erlang Solutions.


      Erlang Solutions: What Is the Fastest Programming Language? Making the Case for Elixir

      news.movim.eu / PlanetJabber · Thursday, 1 February - 10:33 · 12 minutes

    In the realm of technology, speed isn’t merely one metric among many; it’s a way of life. Developers frequently find themselves needing to rethink solutions overnight, underscoring the importance of being able to swiftly modify code. This agility has become indispensable in modern development, especially when evaluating the fastest programming language.

    Because of this, finding the right language is a recurring obstacle for both developers and business owners. Regardless of your use case, Elixir consulting can be one proven way to harness one of the fastest programming languages available today.

    But defining what “the fastest programming language” means in the context of development can be just as complicated. To better understand adaptability and speed in coding languages, we’ve outlined how this should be determined, alongside some of the leading trends that continue to disrupt the concept of fast programming at present.

    What determines a programming language’s speed?

    Several factors go into determining which programming language is the fastest. It’s first important to note that the quality of your code, and the skill of the programmer behind it, matters more than the specific language you’re using. This is why it’s crucial to work with talented, experienced developers well-versed in their respective languages.

    However, there are factors which impact how efficiently code can be executed. One example is multi-threading, or concurrency. Concurrency means a language can perform multiple complicated tasks at once; languages with this capability are therefore often more versatile, and faster, as a result.

    Another core way in which languages differ in terms of speed is whether they’re compiled or interpreted languages.

    Compiled vs interpreted languages

    All programming languages are written in human-readable code and then translated into machine-readable code so they can be executed. The way this information transfer occurs can however have a big impact on both flexibility and speed.

    Interpreted languages are read through an interpreter which then translates the code. Conversely, compiled languages allow the machine to directly understand code without an interpreter.

    A simplified way of thinking about this is to see interpreted languages as a conversation between two people who speak different languages, with an interpreter translating between them. Meanwhile, compiled languages are more like a conversation between two people who speak the same language.

    In practice, this means compiled languages can be executed faster than interpreted languages because they don’t require a translation step.

    Compiled vs interpreted languages

    It also means programmers can be more flexible when using compiled languages, as they have more control over areas like CPU usage.

    Is Elixir one of the fastest programming languages?

    Elixir is a compiled language, which means it has several efficiency benefits when compared with interpreted languages like Python and JavaScript, among others.

    Elixir was also designed with concurrency in mind from the start. This means programmers can easily use multi-threading, allowing them to build complex solutions more effectively. Elixir’s benefits also extend to fault tolerance; whilst not directly improving speed, the ability to keep systems functional makes solutions more reliable and allows developers to solve problems in a targeted way.
    When combining these features with Elixir’s scalability, it becomes one of the fastest programming languages available to developers today.
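    As a concrete, hypothetical illustration of that built-in concurrency, `Task.async_stream/3` fans independent work out across processes and collects the results in order:

```elixir
results =
  1..8
  |> Task.async_stream(
    fn n ->
      Process.sleep(10)  # stand-in for slow work, e.g. an HTTP call
      n * n
    end,
    max_concurrency: 8
  )
  |> Enum.map(fn {:ok, n} -> n end)

IO.inspect(results)  # => [1, 4, 9, 16, 25, 36, 49, 64]
```

    With eight workers running at once, the whole batch takes roughly the time of one slow call rather than eight, and no locks or shared state are involved.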

    Top contenders for the fastest programming language

    In the present day, a plethora of programming languages are available for use, with developers continually innovating and introducing new ones. The effectiveness of a programming language often hinges on its design, usability, efficiency, and applicability.

    It’s essential to grasp the factors influencing the performance of a programming language. Parameters such as execution speed, memory utilisation, and adeptness in managing intricate tasks are pivotal considerations for developers assessing language proficiency.

    That said, let’s delve into the contenders.

    Python: versatility and speed

    Python is a widely used programming language that is great for building highly scalable websites for users:

    Readability and simplicity: Python boasts a syntax engineered for readability and ease of comprehension, prioritising code clarity and maintainability. Its straightforward and intuitive structure allows developers to articulate concepts concisely.

    Abundant libraries and frameworks: Python boasts a rich ecosystem of libraries and frameworks that streamline various web development tasks.

    Thriving community: Backed by a thriving and expansive community of developers, Python experiences continual growth and support.

    Scalability and performance: Python garners acclaim for its scalability and performance, allowing it to manage high-volume web applications.

    Integration and compatibility: Python seamlessly integrates with various technologies, affording flexibility in web development endeavours.

    Swift: the speed of Apple’s innovation

    Swift in mobile app development

    Central to iOS app development is Swift, Apple’s robust and user-friendly programming language. The goal of Swift app development was simplification. Swift’s succinct and expressive syntax empowers developers to craft code that is both cleaner and easier to maintain.

    The main drivers behind its increasing popularity are:


    Benefits of the Swift language for iOS development

    Enhanced syntax and readability: Swift boasts a concise syntax, making it easy to understand and work with.

    Reduced maintenance: Swift streamlines the coding process and operates independently of other programming databases, leading to high efficiency.

    Minimised error probability: With Swift, the likelihood of coding or compiling errors is significantly decreased, as the language emphasises safety and security.

    Interactive playground: The Swift Playground feature enables developers to experiment with coding algorithms without having to complete the entire app, enhancing creativity and coding speed.

    High performance: Swift excels in speed compared to other programming languages, resulting in lower developmental costs.
    Open source: Swift is freely available and allows for extensive customisation based on individual needs.

    Ruby: Quick development and easy syntax

    Ruby on Rails for web applications

    Ruby on Rails (or Rails) is known for its capacity to streamline web development. Rails emphasises efficiency, enabling developers to achieve more with less code compared to many other frameworks.

    Building apps quickly and easily: Rails focuses on quick prototyping and iterative development. This approach minimises bugs, enhances adaptability, and makes the Rails application code more intuitive.

    Open-source libraries: Ruby on Rails has plenty of ready-made libraries available. These libraries enable you to enhance your web application without starting from scratch. The supportive Rails community often improves these tools, making them more accessible and valuable, with ample community support on platforms like GitHub.

    Simple Model View Controller (MVC): Long-time fans of Ruby on Rails swear by the MVC architecture. Thanks to MVC, it’s incredibly time-efficient for Rails developers to create and maintain web applications.

    Reliable testing environment: Rails applications come with three default environments: production, development, and test. These environments are defined in a simple configuration file. Having separate tests and data for testing ensures that testing won’t interfere with the actual development or production database.

    Flexible code modification and migration: Ruby on Rails has flexibility in modifying and migrating code. Migration allows you to define changes in your database structure, making it possible to use a version control system to keep things in sync. This flexibility is great for scalability and cost-effectiveness because you don’t have to overhaul your source code when migrating to another platform.

    Kotlin: a modern approach to speed

    Kotlin in Android development

    Kotlin is a versatile programming language that works on various platforms. It meets Android app development requirements, especially since it’s a supported language for crafting Android app code.

    Kotlin: The official programming language for Android

    Streamlined Android app development: Kotlin presents a more efficient approach to creating Android apps, with a compact library that keeps method counts low.

    Simplified code and enhanced readability: Kotlin shortens code and improves readability, reducing errors and expediting coding processes.

    Open-source advantage: Being open-source ensures consistent support from the Kotlin Slack team, fostering high-quality development.

    Ease of learning: Kotlin proves to be a user-friendly language for beginners, with easily understandable code that empowers developers to solve problems creatively and effectively.

    Increased productivity and accelerated development: Adopting Kotlin leads to heightened productivity and faster development. Safety features like null safety reduce bug occurrences, resulting in quicker debugging and maintenance.

    Java: A balanced blend of speed and functionality

    Java in enterprise solutions

    Java’s “write once, run anywhere” capability makes it a top choice for enterprise software development, offering extensive support across diverse platforms and operating systems.

    This feature enables developers to write code once and execute it across various environments, resulting in significant time and cost savings while minimizing maintenance requirements. In the realm of IT, Java’s cross-platform compatibility ensures seamless operation across platforms like Windows, Mac OS, and Linux, making it particularly well-suited for enterprise needs.

    Security: Paramount in enterprise applications, and Java’s architecture offers robust security features to protect both data and applications, ensuring the integrity of business operations.

    Multithreading: Java’s multithreaded environment enhances performance by enabling faster response times, smoother operations, and efficient management of multiple requests simultaneously. This not only boosts productivity but also reduces development challenges for enterprise applications handling numerous threads.

    Simplicity to use: The simplicity and flexibility of Java coding, coupled with its user-friendly interface, streamline the development process. Additionally, Java’s reusable code promotes efficiency, allowing enterprises to leverage existing codebases for developing new software applications while ensuring ease of maintenance.

    Stability: Renowned for its stability, Java stands as one of the most reliable programming languages, capable of managing errors without compromising the entire application. This stability fosters trust among companies seeking a dependable language to deliver a seamless customer experience.

    Availability of libraries: Java’s vast library support empowers developers with a plethora of resources to address various challenges and fulfil specific functionalities, further enhancing its appeal for enterprise development projects.

    Comparing speeds: the fastest programming languages

    From powering high-performance applications to ensuring swift response times in web services, the programming language used can significantly impact the efficiency and effectiveness of a project. In this exploration of programming languages, let’s uncover the strengths and capabilities of each language in delivering optimal performance across diverse domains.

    C++: the powerhouse of performance

    C++ in game and system development

    In gaming, where milliseconds matter, C++ allows developers to fine-tune performance for smooth gameplay and stunning graphics. Similarly, in system programming tasks like operating system development, C++’s speed and efficiency ensure responsiveness and reliability.

    C#: versatility in the .NET framework

    C# in desktop and web services

    C# shines in desktop and web service development, offering a balance of speed and versatility within the .NET framework.

    While not as low-level as C++, it excels in building responsive desktop applications and powerful web services. With features like just-in-time compilation and memory management, C# enables developers to create applications that perform well and scale seamlessly, whether on the desktop or in the cloud.

    Lesser-known speed demons

    Exploring languages like Assembly, Lisp, and Go

    Beyond the mainstream languages, there are lesser-known options that excel in terms of speed. Assembly, known for its direct hardware manipulation, is a go-to choice for projects requiring maximum performance, such as embedded systems and real-time applications. Lisp, with its powerful macro system, allows developers to optimise code for specific tasks, resulting in highly efficient programs. Go, a relatively newer language, offers simplicity and built-in concurrency features, making it ideal for tasks demanding speed and scalability.

    JavaScript and PHP: Dominating the web

    Scripting languages in web development

    JavaScript and PHP have become foundational in web development, powering a vast majority of websites and web applications. Despite their scripting nature, they have evolved to deliver impressive speed and performance, driving innovation on the web. JavaScript’s advancements in browser technology, including just-in-time compilation, have elevated its performance to near-native levels, enabling the creation of complex client-side applications. Similarly, PHP has evolved into a robust platform for server-side web development, with features like opcode caching and asynchronous processing enhancing its speed and scalability. Together, JavaScript and PHP form the backbone of the web, enabling dynamic and interactive experiences for users worldwide.

    The future of fast programming

    As with all facets of technology, the nature of fast programming is evolving every day. Several trends and innovations are set to transform the concept of efficiency in programming in the coming months.

    Emerging trends in programming speed

    Compiled languages remain more efficient than interpreted languages in general, but this gap is steadily closing. This is thanks to what’s known as “just-in-time compilation”, also known as dynamic compilation, which is a method designed to improve efficiency in interpreted languages.

    Open source development is another important trend when considering how the fastest programming language argument will evolve. These are situations where code is made freely available to everyone so that developers can learn collaboratively. Open source plays a key role in improving programming speeds across the industry, as it means all developers have access to new methods that can be studied and standardised. Languages with larger open source communities may therefore become more efficient over time.

    Both low-code and no-code programming have also become more prominent in recent years. These approaches are no substitute for fully coded applications created by experienced developers, but they do evidence the continued focus on speed and efficiency gains in software development today.

    Innovations and future predictions

    At the moment, AI’s role in programming is mostly speculative. But as the technology evolves, both AI and machine learning may further disrupt the efficiency potential of programming languages.

    One common prediction is for AI to be able to automate some of the more repetitive coding tasks, by analysing coding patterns and then generating short lines of code. In theory, this will reduce the time programmers spend on repetitive tasks, allowing them to experiment and focus on more detailed parts of programming. AI simply isn’t reliable enough to provide this level of support across the profession yet, but that may change in the coming years.

    Speed in programming isn’t simply about developing initial builds quickly; it also concerns the ability to scale at speed. Scalability potential in programming languages will therefore continue to play a pivotal role in their selection for advanced systems in the future.

    Finally, coding practices designed to streamline and automate the process of programming, like implementing CLIs (command-line interpreters), will continue to play a role in programming speed gains. Being versatile is already a key part of a programmer’s job description, but being able to write efficient, lean code will likely grow in importance as speed and scalability both remain core priorities.

    Choosing the fastest programming language for your needs

    Determining which programming language is the fastest depends on your individual use case. If you’re looking to create a web solution, for example, you’d specifically need the fastest web programming language.

    If you’re working with complex, distributed systems that need a high level of fault tolerance and the ability to scale, Elixir is the ideal language to work with. Find out more about its efficiency potential on our Elixir page, or by contacting our team directly.

    The post What Is the Fastest Programming Language? Making the Case for Elixir appeared first on Erlang Solutions.

    • chevron_right

      Ignite Realtime Blog: Non-SASL Authentication Openfire plugin 1.1.0 released!

      news.movim.eu / PlanetJabber · Tuesday, 30 January - 19:35

    We’ve just released version 1.1.0 of the Non-SASL Authentication plugin for Openfire! This release fixes a compatibility issue with Openfire 4.8.0.

    The Non-SASL Authentication plugin provides an implementation for authentication with Jabber servers and services using the jabber:iq:auth namespace, as specified in XEP-0078: Non-SASL Authentication.

    Note Well: The protocol implemented by this plugin has been superseded by SASL authentication, as specified by the XMPP standards in RFC 3920 / RFC 6120, and is now obsolete. This plugin should not be installed in Openfire unless there is a pressing need for backwards compatibility with XEP-0078.

    The update should be visible in the Plugins section of your Openfire admin console within the next few days. You can also download it from the plugin’s archive page.

    For other release announcements and news, follow us on Mastodon or X.

    1 post - 1 participant

    Read full topic

    • wifi_tethering open_in_new

      This post is public

      discourse.igniterealtime.org /t/non-sasl-authentication-openfire-plugin-1-1-0-released/93553

    • chevron_right

      Erlang Solutions: 5 Key Tech Priorities for Fintech Leaders in 2024

      news.movim.eu / PlanetJabber · Thursday, 25 January - 10:42 · 7 minutes

    In the fast-paced world of financial tech, staying on top isn’t just about seeing ahead—it’s also about committing to evolving strategies. For CTOs leading the charge, we’re taking a closer look at the 5 key things they should focus on in 2024, building on what we talked about in 2023.

    If you caught our last piece, you’ll know the landscape has changed, bringing in new challenges and opportunities that need a fresh perspective.

    Digital currencies from central banks are gaining momentum

    In 2024, big changes are happening in fintech, particularly with cryptocurrencies. Evolving past speculation, they’re becoming a big part of regular financial systems, shaking up established ways of doing things in finance.

    At the same time, businesses are getting on board with using cryptocurrencies for everyday transactions. This shift is blurring the lines between old-school finance and the new digital finance wave, making global financial systems more flexible and connected.

    Countries like China, Sweden, South Korea, the US, and the European Union are taking the lead in exploring and possibly launching Central Bank Digital Currencies (CBDCs). The goal? To make transactions cheaper, include more people in the financial system, and revolutionise how we make payments across borders.


    Market size of central bank digital currency (CBDC) worldwide in 2023, with a forecast for 2030, Statista

    For fintech companies, this is a golden opportunity to be the go-to partner for those navigating these changes. They can use innovative solutions, especially in infrastructure, security, custody, data management, market analytics, and transaction monitoring.

    Jumping back to 2023: it was a turning point for CBDCs. Many countries tested them out, and some are already using them. The European Central Bank is gearing up for its digital euro project after two years of digging into it. According to a study by Juniper Research, we’re looking at a whopping $213 billion processed through CBDCs by 2030, showing how much they’re set to grow. Looking specifically at 2024, Juniper further predicts more specific uses, like cross-border payments and business transactions.

    Why the focus on CBDCs? They offer a stable and reliable digital currency, especially when you compare them to the crazy ups and downs of other cryptocurrencies. So for CTOs navigating these changes, it’s time to get your companies ready to ride the wave of digital currencies.

    The growing trend of embedded finance

    We’ve mentioned the increasing popularity of Central bank digital currencies (CBDCs). At the same time, embedded finance is transforming how financial services are integrated into non-financial platforms.

    This means users can access banking, payments, and other financial features without leaving the apps they’re using. For example, services like Buy Now Pay Later (BNPL) are gaining traction among younger consumers, even though they come with risks like debt accumulation. Despite these challenges, BNPL transactions are expected to grow significantly from $120 billion in 2021 to $576 billion by 2026.

    Embedded finance is spreading beyond traditional sectors, with banking services now available in e-commerce and ride-hailing apps. Even companies like Tesla are getting in on the action by offering insurance with their car purchases. However, as we approach 2024, the convenience of embedded finance also brings challenges, especially concerning data privacy and security.

    Despite this, the combination of central bank digital currencies and embedded finance is undeniably reshaping the current landscape. The rise of digital currencies reflects a shift in how transactions are done, while embedded finance is changing how users interact with financial services. This presents both opportunities and challenges for businesses, and as CTOs, understanding and adapting to these trends will be crucial for staying ahead in fintech innovation.

    Decentralised finance becomes more popular

    This surge in digital currencies and central bank initiatives is driven in particular by the maturation and expansion of decentralised finance (DeFi).

    DeFi is in a transformative phase, refining lending and borrowing protocols while bringing innovative features to decentralised exchanges. The shift towards DeFi is part of a broader effort to democratise financial services, offering an enriched array of choices beyond conventional banking. As we enter this new era of financial engagement, the collaboration between fintech and DeFi projects is noteworthy. Fintech companies, recognising the potential impact, are contributing expertise and resources to make DeFi more accessible, secure, and user-friendly.

    On the quantitative front, Grand View Research valued the decentralised finance market at 13.61 billion USD in 2022, with projections indicating a substantial revenue increase to 231.19 billion USD by 2030. These numbers underscore the exponential growth and significance of DeFi in reshaping the financial landscape.


    Decentralised Finance Market (in USD)

    2024 marks a pivotal moment where CTOs can leverage technological expertise to navigate the complexities of central bank digital currencies and the expanding DeFi ecosystem. Embracing these changes strategically can position organisations at the forefront of fintech, fostering innovation and resilience in rapid transformation.

    Personalised financial services using AI and machine learning

    The integration of artificial intelligence (AI) and machine learning (ML) is redefining financial services. Artificial intelligence is already rapidly advancing, with innovations like ChatGPT, DALL-E, and Midjourney transforming how businesses approach technology. AI and ML are poised to disrupt various sectors this year, particularly fintech, leading to notable developments in how financial technology is advanced and adopted. A notable study by The Economist reveals that 52% of traditional banks are already leveraging both artificial intelligence and machine learning for various business functions, as shown below:


    The Economist Intelligence Unit survey

    AI is predominantly employed by banks for fraud detection, with 58% heavily relying on it and an additional 32% using it to some extent. Similarly, in optimising IT operations, 54% utilise AI extensively, while 36% employ it to some extent.

    Virtually all banks currently incorporate AI to some degree or have plans to do so within the next three years, spanning various business domains such as operations and customer experience. The upcoming areas for substantial growth encompass personalised investments, with 17% planning adoption in the next 1-3 years, followed by credit scoring (15%) and portfolio optimisation (13%).

    Customised services that are set to further impact financial behaviours will include:

    Managing Risk:

    AI tools enable businesses to analyse and improve regulatory approaches, shifting from reactive to proactive risk management.

    Enhancing Customer Experience:

    Fintech companies leveraging AI for customer experience gain a competitive edge: they can offer personalised financial services and use AI-driven automation, chatbots, and virtual assistants to anticipate user needs and provide real-time, tailored support and financial guidance.

    Automation in Fintech:

    AI-driven automation in Fintech goes beyond routine tasks, handling complex decision-making flows like loan approvals, improving efficiency, and reducing operational costs.

    AI and Blockchain Synergy:

    Integration of AI with blockchain enhances the security, transparency, and scalability of financial transactions, particularly in smart contracts.

    Transforming Payments with AI:

    AI-powered payment solutions offer faster, more secure transactions, with machine learning analysing user behaviour for personalised experiences and biometric authentication.

    Modernising Traditional Financial Services with AI:

    AI applications in traditional financial services include customer service automation, fraud detection, and personalised portfolio management.

    The upcoming year is set to play a crucial role in the continuous advancement and incorporation of technologies driven by artificial intelligence. Fintech firms, although capable of enhancing efficiency and capabilities, must collectively prioritise values such as transparency, fairness, and user-centricity.

    Improved security and authentication

    In 2024, biometric authentication is reshaping how we secure finances. It uses unique traits like fingerprints and faces, along with newer methods like voice recognition. This approach simplifies user interactions, doing away with traditional PINs and passwords.

    Alongside this, identity trends for the year include improving Single Sign-On (SSO) with added security features and the rise of decentralised identity systems using blockchain. Cloud-based Identity-as-a-Service (IDaaS) is also growing for scalable and cost-effective solutions.

    Guidance from regulations is essential. Striking a balance between convenience and privacy, robust data protection is also crucial. In summary, 2024 is a key time for financial security, making transactions safer and smoother. CTOs should navigate these changes, embracing advancements while handling regulatory challenges.

    To conclude

    In the ever-evolving landscape of fintech, the priorities for CTOs are not static but fluid, responding to dynamic shifts in technology and consumer behaviour. The challenges and opportunities of 2024 underscore the need for CTOs to exhibit adaptability and strategic foresight.

    If you want to start a conversation about engaging us for your fintech project or talk about partnering and collaboration opportunities, don’t hesitate to contact the Erlang Solutions team. We seamlessly prototype, build, monitor, and maintain mission-critical solutions for payment systems, digital lending, clearing and settlement services, and more. Trusted by industry leaders like Klarna, Vocalink (Mastercard), Visa, Danske Bank, and Safaricom, our consultative approach, combined with our team’s expertise, ensures that businesses can confidently direct their resources toward strategic goals and growth.

    The post 5 Key Tech Priorities for Fintech Leaders in 2024 appeared first on Erlang Solutions.

    • wifi_tethering open_in_new

      This post is public

      www.erlang-solutions.com /blog/5-key-tech-priorities-for-fintech-leaders-in-2024/

    • chevron_right

      Ignite Realtime Blog: Creating the XMPP Network Graph

      news.movim.eu / PlanetJabber · Wednesday, 24 January - 19:31 · 10 minutes

    At the risk of sounding like an unhinged fanboy: XMPP is pretty awesome!

    I’ve been involved in one way or another with XMPP, the network protocol that is an open standard for messaging and presence, for the last two decades. Much of that revolves around development of Openfire, our XMPP-based real-time communications server.


    Decentralisation with XMPP

    There’s much to say about what I like about XMPP, but let me focus on one thing in this text: decentralisation. Not only the protocol, but the entire XMPP ecosystem - with all its different server and client implementations - is based on the principle of federation. This allows anyone to set up their own solution for instant messaging, voice and/or video conferencing, data sharing, and much, much more. All of this is done without creating a dependency on any central entity or organisation. At the same time, you’ve created a solution that allows you to communicate with others that are not part of your own domain.

    Some of the benefits of decentralisation are obvious: you get to control your own data. When you’re not sharing data with one monolithic IM solution provider, then there’s a lot less to worry about with regards to their privacy policies, marketing strategies, cookie policies and data security.

    Another benefit of using a decentralised approach is diversity. I know of at least seven or eight different XMPP server implementations - all of which are mature projects that have a proven track-record of interoperability for many, many years. Each of the server implementations have their own strengths. Some favour raw network performance, others offer a more complete social media feature set. Some focus on being active in the Small and Medium-sized Enterprises segment, others try to cater to family & friends type of communities. Some are open source, others are commercial products. There are products that offer a turn-key, no-config-needed full blown instant messaging solution, while others can act as a development platform that is useful when you’re looking to develop your own networking application. As you might be able to tell from all this, diversity gives you the option to select the best software suite for your needs.

    I digress. XMPP federation is based on server-to-server connections. Whenever a user on one domain starts to interact with a user on another domain, servers of both domains will connect to each other in the background to facilitate this interaction. As you might imagine, when enough users start to interact with each other, this leads to interesting webs of interconnected domains.

    Reviving an old idea: creating a network graph!

    Over the last holiday season, I remembered an old, now defunct project by Thomas Leister that generated a visual representation of interconnected servers. Its visuals were pretty amazing. I remember Tom’s solution being based on an out-of-band exchange of data (through a simple webservice), and recalled his desire to replace this with a solution that used XMPP’s own protocol and functionality. His stated goal was to use XMPP’s Publish/Subscribe functionality, but he never seemed to get around to implementing that. I had some spare time over the holidays and challenged myself to build this new approach. I started work on a new version of that project, aiming to build a web application that renders a semi-live network graph of XMPP domains with their connections to other XMPP domains.

    The path from prototype to a fully working solution was an interesting one, involving a couple of different aspects of development within Openfire, but also the XMPP community as a whole.

    Using Openfire as a development platform

    Perhaps unsurprisingly, but I love working with Openfire. It’s so incredibly versatile when it comes to adding new features and functionality.

    For this new project, I needed a couple of things.

    1. An API to add functionality to Openfire. Check. Openfire’s Plugin API gives you structured access to pretty much everything in Openfire. It’s easy to use, yet very versatile.
    2. A web server. Check. Openfire ships with Eclipse Jetty, an embedded web server. It’s used for Openfire’s Administration Console, but can just as easily be instantiated to serve different content. Openfire’s BOSH connectivity makes use of this, as do plugins like the Random Avatar Generator and Presence Service, which expose user-specific content, and the inVerse and JSXC plugins, which each serve a full-fledged web-based XMPP client.
    3. A Publish-Subscribe service. Check. Openfire out of the box implements XMPP’s pub-sub standard.
    4. Database storage. Check. Openfire ships with support for most popular relational databases. Crucially, it allows a plugin to define and manage its own database structures.
    5. A JavaScript graphing engine. From a quick search, various applicable projects popped up. I opted to go with vis.js, for no other reason than that it was the first thing that popped up, looked reasonably mature and had easy-to-follow documentation. I later added VivaGraph, which offers WebGL support. Turns out that if you render thousands of nodes in a network, CPUs tend to get busy. Who knew? WebGL helped make things more efficient.
    6. Basic HTML and CSS design skills. :grimacing: I am many things, a good designer is not one of them.

    My first prototype wrapped all of this into a solution that:

    • Periodically iterated over all server-to-server connections
    • Stored all information in a simple data structure
    • Persisted the data structure in the database
    • Created a web service that exposes the data as ‘nodes’ and ‘edges’ to be consumed by the graphing software
    • Served a simple, static webpage that consumes that webservice and renders the graph using the third-party graphing engine.
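As a rough sketch of what that webservice exposes (the field names here are my own illustration, not the plugin’s actual schema), the recorded server-to-server connections boil down to the list of nodes and edges that a vis.js-style graphing engine consumes:

```javascript
// Hypothetical sketch: turn recorded server-to-server connections into
// the { nodes, edges } shape that vis.js-style libraries consume.
// Field names are illustrative, not the plugin's real data model.
function buildGraphData(localDomain, connections) {
  // Collect every domain seen, starting with our own.
  const domains = new Set([localDomain]);
  for (const c of connections) {
    domains.add(c.remoteDomain);
  }
  // One node per domain, one edge per recorded connection.
  const nodes = [...domains].map((d) => ({ id: d, label: d }));
  const edges = connections.map((c) => ({
    from: localDomain,
    to: c.remoteDomain,
  }));
  return { nodes, edges };
}

const graph = buildGraphData("example.org", [
  { remoteDomain: "jabber.example" },
  { remoteDomain: "chat.example" },
]);
console.log(graph.nodes.length); // 3
console.log(graph.edges.length); // 2
```

With only one server feeding the data, every edge fans out from the local domain, which is exactly the hub-and-spoke shape described below.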

    In all, I was pretty proud to have been able to write all this in a single evening!

    The approach above gave me a nice hub-and-spoke graph, where my server was the hub, showing spokes to every connected remote domain.

    To be able to install this on more than one domain, I separated the plugin into two:

    1. One plugin that aggregates the connectivity data, to be installed on all servers on the network
    2. Another plugin that generates the website, installed only on the server that acts as its public interface.

    I’ve used XMPP’s Publish-Subscribe feature to bridge the gap between the two plugins. After some quick modifications, the first plugin creates a pub-sub node on the local pub-sub service, to which the second plugin subscribes. The second plugin then aggregates all of the data in its database, and uses that to populate the webservice, just as before.

    Using this mechanism, it is pretty straightforward to have many servers feeding one website. With a bit more work, I was even able to write a quick crawler that tries to find pub-sub nodes with similar information on all of the XMPP domains that are reported as remotely-linked domains, which removed the need to have every server sign up to the website manually.

    Finally, I paid my hosting provider a little extra for a new server to host a fresh Openfire instance that would act as the public website, going through the motions of obtaining a domain name and corresponding TLS certificate. Having done this before, I automated most of that, allowing me to create a new Openfire server from scratch in about ten minutes. I manually installed the new plugin, installed a reverse proxy to serve web content on standard ports, and, presto! The XMPP Network Graph suddenly became a publicly available service!

    Some of the community members at IgniteRealtime.org were happy to install my plugin, which quickly contributed to the network graph growing.

    Working with the XMPP community

    To be able to grow the XMPP network graph, it is desirable to have support added to more server implementations than just the one for Openfire. As luck would have it, the XMPP ecosystem, as stated above, thrives on diversity.

    To allow for a great deal of extensibility and flexibility, and to optimise interoperability, the XMPP Standards Foundation manages a pretty nifty process for extending XMPP, through documents aptly named XMPP Extension Protocols (XEPs). The full process is documented in the very first XEP, XEP-0001. Have a read, if you’re interested.

    The standardised way to get XMPP entities to interoperate on a bit of functionality is simple:

    • Write a XEP to document the functionality
    • Submit the XEP to the XSF for review and publication
    • Incentivise others to adopt the XEP

    I did just that, and found the added value of this process to be unexpectedly high.

    A submitted XEP makes for a convenient discussion subject. My original document quickly drew feedback.

    Although I was aware of Thomas’ implementation, others apparently had also toyed with creating network graphs of the XMPP network. It seems I’m even further from having had an original idea than I expected.

    The feedback from the XMPP community showed the expertise and experience that lies within that community. Several technical issues were discussed, which led to improvements of the protocol. Probably the most important bit of feedback that was given related to privacy concerns, which we discussed at length.

    The XMPP ecosystem consists of servers of all sizes. There are various XMPP service providers that each have many thousands of users. There are also plenty of servers that are running for family and friends, or even a single user. It is these servers that were the subject of the privacy concern.

    If a connection is visible between two of these small servers, it becomes reasonably evident that two specific individuals are communicating with each other. If both individuals agree to have this information published, then there’s no privacy concern - but what if only one individual does so? If John makes public that he’s connecting to Jane, then the fact that Jane is communicating with John is implicitly made public too. If other friends of Jane (Jack, Jill and Johan) similarly publish all their connections, then determining who Jane’s friends are becomes pretty straightforward - without Jane having consented to any data publication.

    This, rightly, got flagged in early feedback from XSF members. We’ve discussed the impact of this problem, the need to address it, and various strategies to resolve the issue. We ended up with a solution that allows any server to publicise its connections, but requires it to automatically verify that each peer opts in to having a connection to its server be identifiable (those that do not show up as anonymous blips in the graph).
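A minimal sketch of that policy (my own illustration, not the XEP’s actual wording or data model): an edge endpoint is only named if that domain has itself opted in, and is otherwise replaced by a stable anonymous placeholder.

```javascript
// Hypothetical sketch of the opt-in rule described above: a domain is
// only identified in the published graph if it has opted in itself;
// otherwise it shows up as an anonymous node.
function anonymiseEdges(edges, optedIn) {
  let counter = 0;
  const anonIds = new Map(); // stable placeholder per hidden domain
  const mask = (domain) => {
    if (optedIn.has(domain)) return domain;
    if (!anonIds.has(domain)) {
      anonIds.set(domain, `anonymous-${++counter}`);
    }
    return anonIds.get(domain);
  };
  return edges.map(([from, to]) => [mask(from), mask(to)]);
}

const optedIn = new Set(["john.example"]);
const published = anonymiseEdges(
  [["john.example", "jane.example"], ["john.example", "jack.example"]],
  optedIn
);
console.log(published);
// [["john.example", "anonymous-1"], ["john.example", "anonymous-2"]]
```

John’s connections still show up, but Jane and Jack remain anonymous until they publish an opt-in of their own, which keeps one user’s choice from exposing another’s social graph.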

    Based on the feedback, this and other improvements were quickly made to the XEP and my Openfire implementation. Now that there was a stable-ish protocol definition, it became easy for compatible implementations to be created for other XMPP servers. To date, there are implementations for Tigase, Prosody and ejabberd - and there’s mine for Openfire, of course. Not a bad score, after only a few weeks of development!

    Wrapping up

    My XMPP Network Graph project has been maturing nicely in the last few weeks, as you can see from the screenshot above. You can have a look at and interact with the network graph at xmppnetwork.goodbytes.im. At the time of writing, it contains over 6,600 domains. It is pretty powerful to see how many people are interacting over XMPP - and that’s only in the small part of the network being mapped by the graph!

    You can now add your own XMPP server to the graph! The plugin that I created for Openfire can be found here. Plugins or modules are available for other XMPP servers too. Have a look at the FAQ section of the XMPP Network Graph for instructions on how to add your server to the network graph!

    I’ve enjoyed the process of setting all this up. Having most of the development pieces already in place, as mentioned above, allowed for rapid development. To me this is a testament to the power of not only Openfire as a development platform but also XMPP as the versatile Swiss Army knife of network protocols.

    I’d love to learn what you make of this! Do you have success stories of your own to share? I’d like to hear from you!


    • chevron_right

      Ignite Realtime Blog: HTTP File Upload plugin 1.4.0 released

      news.movim.eu / PlanetJabber · Friday, 19 January - 16:15

    The HTTP File Upload plugin is a plugin for Openfire that allows users to easily share files (such as pictures) in a chat.

    A new release is now available for this plugin: version 1.4.0.

    This release introduces a couple of interesting security improvements:

    • An additional guard has been added that should prevent scripts embedded in uploaded data from being executed without the recipient’s approval.
    • It is now possible to configure a virus scanner that will process all uploaded content.

    Configuration details are available on the plugin’s archive page, and in its readme file.

    The update should be visible in the Plugins section of your Openfire admin console within the next few days. You can also download it from the plugin’s archive page.


    • chevron_right

      Ignite Realtime Blog: Presence Service plugin v1.7.2 release

      news.movim.eu / PlanetJabber · Friday, 19 January - 14:53

    The Presence Service plugin is a plugin for Openfire that provides simple presence information over HTTP. It can be used to display an online status icon for a user or component on a web page, or to poll for presence information from a web service.

    A new release is now available for this plugin: version 1.7.2.

    In this release, an incompatibility with the recently released Openfire 4.8.0 was fixed. Also, a reportedly infrequent issue with loading images has been addressed.

    The update should be visible in the Plugins section of your Openfire admin console within the next few days. You can also download it from the plugin’s archive page.
