August 31, 2016

Kexi 2 contains a map plugin powered by Marble.

With the port to Qt 5 ongoing, this plugin was disabled. That has now changed: with an hour’s porting, the KReport library has regained support for map items within a report, putting it on par with Kexi 2 (bugs and all!)

The proof is in paper format, and this is a shot of a database created in Kexi 2:

The second item is not displaying; IIRC this also happened in Kexi 2, so some debugging is required!

This should land in KReport master soon, so if you need a report library, you know where to look ;)

I’m going to QtCon/Akademy tomorrow. I will be in Berlin until the afternoon of the 7th. It’s going to be my first Akademy (and my first time in Berlin as well), so I’m really looking forward to the trip and to meeting interesting people. This is, briefly, what I’m planning to do:

  • I’ll try to submit my first patch to Qt, to fix an annoying bug with the Plasma native file dialog.
  • I’ll try to release a first beta of KIO GDrive.
  • I’m planning to merge Vladyslav Batyrenko’s GSoC project into Ark’s master branch.
  • BoF sessions I’m interested in: at least Flatpak and KApidox.

See you in Berlin!

Little by little, the audiovisual material from the latest gathering of KDE supporters in Spain is being uploaded. So I’m pleased to share with you the videos from Saturday afternoon of Akademy-es 2016 in Madrid, the most intense day, with lightning talks, technical presentations, a surprise or two, the group photo, and the invitation to become part of the KDE España association.

Videos from Saturday afternoon of Akademy-es 2016

Videos from Saturday afternoon of Akademy-es 2016 in Madrid

Lightning talks

After some good pizzas and tortillas, the afternoon session of Akademy-es 2016 began with the lightning talks: short presentations of about ten minutes introducing small (or big) projects and giving a quick overview of them.

On this occasion we were lucky to have:

Open365, bringing desktop applications to the cloud, by Alex Fiestas

Alex Fiestas told us about his work at eyeOS, its development and its problems, up to the launch of Open365, a project that can compete with Office 365 and Google Docs and that has already been released as free software.

Valgrind and ASAN: two ways to instrument code in search of errors, by Albert Astals Cid

Albert Astals brought us a very technical topic: Valgrind and ASAN, essential tools for all developers, used to hunt down errors. Albert explained how they work and their advantages and disadvantages.

Lightning talk on CoderDojo

The surprise of the afternoon: one of the attendees wanted to tell us about a project that aims to bring programming to the youngest: CoderDojo.

Clazy: improving your code at compile time, by Albert Astals Cid

Albert Astals took the stage again to present a new KDE project called Clazy, which will help programmers improve their code and, with it, their applications. As you would expect, he gave us a very technical talk.

The Power of the Terminal, by Erik Poveda Diaz

In the penultimate talk of the afternoon, Erik Poveda took us on a historical tour of the power of the terminal; as it ran a little short, it was merged with…


KDE España: what it is and what it is for, by José Millán

To close the day we had the customary talk about KDE España, given by José Millán, who went over the association’s formation, its principles and goals, without forgetting its basic workings and the requirements for becoming a member.

August 30, 2016

I'm coming back home on Tuesday afternoon-ish, so if you want to catch me for something, be sure to make it before then :)

See you in Berlin!

Wow! What a summer!

Roughly 12 weeks have passed since I started working on KDE Now (well, formally). Results came out a couple of hours ago, and I passed the full-term evaluation. In this post, I’m going to talk about what I’m planning for the future of KDE Now, how the past weeks have been, and some other things.


To start with, the summer was awesome. I got to learn so much, and that has certainly made me a better programmer. Before this summer, I had worked on parts of large codebases but never really started one from scratch. KDE Now taught me a lot about everything you have to face when starting a project: the design choices you have to make, even down to the bottommost nitty-gritty details like const correctness or avoiding copies of data in function calls. All these tiny bits eventually contribute to a better, working, cohesive whole.


As for my work, you can find the formal work report here. KDE Now now has a quickgit repository. It’s currently lying in playground and will continue to do so for a couple of months, until I have everything sorted out before a possible release. First off, I’m going to work on documentation, to attract other developers to contribute. Who wants to work on undocumented software anyway 😛 Next up, I will write unit tests. I have tested the application, but only as a whole in a few different scenarios; unit testing would be a lot better. But before all that can happen, I want a break. Haha! I’ve been working on this for so long that other things took a lower priority. School started a while ago, and a lot of things require my attention. So I think I should take a month off. You’ll hear from me again, with more updates, from the start of October. Meanwhile, if you have any doubts or questions, or want to contribute to KDE Now, I’ll be glad to answer everything. Just comment below or drop me a mail 🙂


I am very thankful to my mentor, Ashish Bansal. He is simply an awesome mentor. He would point me in a direction and leave the rest for me to find out on my own. He would tell me which design choices were better and what their trade-offs were, but then again, he left it to me to choose. We talked every day or two. There were times when I wasn’t very productive, but he was pretty cool with that too, as long as I was not terribly behind schedule. I even poked him during his vacation, and he still managed to reply! I’m really thankful to him; I couldn’t have asked for a better mentor. I am also thankful to KDE for giving me this opportunity, and finally to Google for organizing this awesome program.


Cheers!

See you later.


Last week we updated the Qt Forum to the latest version of NodeBB.

We had been planning the upgrade for a while, but had to do it on short notice, as a bug that leaked user emails was found in the forum. Thanks to Justin Clift for pointing out the issue to us!

This means that it was possible for someone to find out user emails from the forum. For those users who have their email public, this is not an issue, but some of you want to keep your email to yourselves. The bug meant that these email addresses could also be found.

No other data was available through the bug, and as we are using a central sign in service, no account information could leak from the forum.

So if you have gotten more email spam than normal, this might be one cause.

We are sorry for the leak but, in our defence, we did not know of it, and we patched the system within a day of becoming aware of the issue.

But on to the upgrade itself.

With the upgrade we changed to the new default theme used by NodeBB. It looks quite different from the old theme, and has already gotten feedback both for and against. Personally, I’m getting used to the look and feel, and after the initial shock, I like it. That’s a personal opinion; your mileage may vary, and please do tell us in the comments.

Due to the rushed upgrade, some small things still need tweaking; the colours are a bit off from the Qt green, which will be fixed as soon as I find the time for it.

The reasoning for updating the theme is that we can now follow the NodeBB upgrades faster, as we do not need to customise the theme as much as before. This will bring improvements to you faster.

As an example of a new feature, we now have chat rooms instead of only one-to-one chats on the forum. To create a room, start a chat and add other users from the chat window settings. At least for the forum regulars, this is quite an improvement.

So what do you think of the new Qt Forum look? Please tell us in the comments or drop by the forum to share your opinion.

Updated to credit Justin for finding the leak, thanks again!

The post New Forum theme and security notice appeared first on Qt Blog.

Last week, I spent 4 days at the Krita Sprint in Deventer, where several contributors gathered to discuss the current hot topics, draw and hack together.

You can read a global report of the event on news.

On my side, besides meeting old and new friends, and discussing animation, brushes and vector stuff, I made three commits:
  • replace some duplicate icons with aliases in qrc files
  • update the default workspaces
  • add a new “Eraser Switch Opacity” feature (this one is on a separate branch for now)

I also filed new tasks on phabricator for two feature requests to improve some color and animation workflow:

Once again, I feel it’s been a great and productive meeting for everyone. A lot of cool things are ready for the next Krita version, which is exciting! Many thanks to KDE e.V. for the travel support, and to the Krita Foundation for hosting the event and providing accommodation and food.


LWN reports on the sad death of Vernon Adams, designer of the Oxygen font and author of the invaluable how to use Font Forge guide.

VDG Artist Thomas Pfeiffer writes:

The name Vernon Adams might not ring any bells for you, but if you have used Plasma in the recent past, you know at least one of his works: The Oxygen font, which was Plasma's default user interface font for a long time.

Vernon did excellent work on the font, and we'd still be using it as our default today if a tragic car accident had not rendered him unable to continue his work (sadly nobody else took it up, either). Sadly, Vernon has now passed away.

Vernon Adams will always be remembered by the Free Software community for his tireless work for freedom in font design, and we hope he will inspire countless font designers to come.

On Friday QtCon starts, and there will of course be an update about the current state of Wayland support in Plasma. See you during the lightning talk session on Friday between 17:30 and 18:30 for my lightning talk “We are in Wayland!”

The news is a bit old, something that is becoming a habit on this blog during an end of August unusually full of fresh stories. So today it falls to me to announce something I suppose everyone already knows: in mid-August, a bombshell shook the Community when it was announced that Canonical would become a new patron of KDE. And, to round off the article, I’ll share my personal reflection with you, from the point of view of a user who doesn’t know all the ins and outs of the Free Software world.

Canonical, a new patron of KDE e.V.

For those who live in a cave or have been on vacation, this may come as a surprise: Canonical became a patron of KDE in mid-August. The announcement surprised friends and strangers alike, given Canonical’s irregular relationship with the KDE project.

Let’s not forget that Kubuntu has always been an odd one out within the xUbuntu family of distributions, that it went from being “official” to belonging entirely to the Community a few years ago, and that recently part of its team left the project and created KDE Neon.

With all this history, the Canonical–KDE relationship did not seem to be at its best, but life is full of surprises, and on August 18th KDE.News, KDE’s official communication channel, announced that Canonical was becoming a new patron of the project.

In practical terms, this means that Canonical will contribute €10,000 a year to KDE to promote the development of Free Software, in exchange for appearing as a KDE patron on the website (and in other media) and for promoting the adoption of “Snap packages in the KDE workspaces and applications so that they are easily installable by users of any Linux distribution,” in the words of Michael Hall, Ubuntu’s Community Manager.

For its part, KDE e.V. views the news very positively. In the words of Aleix Pol, vice president of the foundation, “it is important to keep making efforts together, as it is the best way to continue offering a free and open development platform,” and he trusts that the collaboration will be beneficial for the whole community and for the GNU/Linux ecosystem.


More information: KDE.News | Genbeta

My personal reflection

Some people have thrown up their hands, thinking that a company sponsoring Free Software projects means losing independence, but it really should not surprise us. In my opinion, that will not happen. Without looking any further, KDE is sponsored by Blue Systems, Google, SUSE and The Qt Company, so Canonical is simply joining the KDE project in a visible way, and to date, as far as I know, the project has not been swayed by corporate decisions.

From my point of view, as long as the leaders of these projects are clear that their work is by and for the Community, there should be no problem. That said, lovers of the project know that for it to keep growing it needs funding: physical infrastructure such as servers must be maintained, human relationships must be fostered through events, all the paperwork must be kept in order, materials are needed for a presence at other events and, if possible, people should be hired full-time to work on the project.


All this costs money, and it has to come from somewhere. As I see it, there are currently only two viable ways to raise funds: donations from the Community (such as the campaigns for the Randa Sprints or the Krita fundraisers) and corporate sponsorship. To my mind there are two more: advertising, which is not desirable at all, and support from public institutions (governments, city councils, universities, etc.), a route that in my opinion should be explored further and that we may be squandering.

Obviously, one source of funding does not exclude the others, so if there are companies that want to sponsor KDE and they have known ties to Free Software, I see no problem. Software companies wanting to attach their name to KDE is nothing bad; it is simply a sign that the KDE project has prestige, exudes quality and has a bright future.

That said, it would not be good for this to become the only route. Community projects are a reflection of their Community, so the Community’s participation is extraordinarily important. Members of the KDE Community must understand that being part of a Community does not mean receiving everything without giving anything back; the “payment” asked of them is to take part in every aspect of the Project they can: donating, writing code, promoting the project, and so on.

20 ways to collaborate with KDE

In short, I think it is good for companies to join as patrons, as long as they do not dictate the project’s development direction. However, I also believe that, to keep that from happening, KDE needs a Community that is involved in the project: active, participative, dynamic and enterprising.


So, yeah. I failed the final evaluations of Google Summer of Code. I didn't expect it, but I think I should have added more features to my project and worked harder. To be honest, this was one hell of a summer. Working on such a project, OfflineExtension for WikiEditor, with such an awesome community, WikiToLearn, and such cool mentors, Gianluca Rigoletti and Irene Cortinovis, is worth millions. I couldn't pass the final evaluations, but I gained so much that I have no hard feelings. Of course, I felt bad, but in the end, the experience is all that matters. Thanks, Google and WikiToLearn, for giving me such an amazing opportunity. I will keep working on my project and try to get it into production as soon as possible. Also, I will reapply for Google Summer of Code next year with the same organization, and will definitely pass :'). And the bad part ends here. The good news is that I'm now a core developer of WikiToLearn, and I find this opportunity to be more worthy than being a GSoCer, so this.failure eventually equals this.success (sorry for this, I'm learning Java :P). Thanks, Riccardo and Gianluca, for giving me this opportunity. I’ll give my best, work harder, learn, and contribute to it as much as I can. :)

Long Live WikiToLearn, Long Live KDE, Long Live Mediawiki and Long Live FOSSatAmrita! :)

August 29, 2016

That’s it. After a combined total of 217 git commits, 6,202 lines of code added, and 4,167 lines of code deleted, GSoC 2016 is finally over.

These twelve weeks of programming have been a very enriching experience for me and making this project has taught me a lot about production quality software development. Little did I know that a small project I had put together in a 6 hour session of messing around with Qt would lead to something as big as this!

There have been many memorable moments throughout my GSoC coding period - from the first time I got an ioslave to install correctly, to writing its “Hello World” equivalent, to getting a basic implementation of the project up and running through a series of dirty hacks on Dolphin’s code. There were also times when I was so frustrated with debugging for this project that I wanted to do nothing but smash my laptop’s progressively failing display panel with the largest hammer I could find. The great mentorship from my GSoC mentor and the premise of the GSoC program itself kept me going. This also taught me an important lesson about software development - no one starting out gets it right on their first try. It feels like, after a long run of not quite getting the results I wanted, the GSoC is the thing which worked out for me, as everything just fell into place.

There’s a technical digression here, which you can feel free to skip through if you don’t want to get into the details of the project.

Following up from the previous blog post: with the core features of the application complete, I had moved on to unit testing my project. For this project, unit testing involved writing test cases for each and every component of the application to find and fix any bugs. Despite the innocuous name, unit testing this project was a much bigger challenge than I expected. For one thing, the ioslave in my project is merely a controller in the MVC system formed by the virtual Stash File System, the Dolphin file manager, and the KIO slave itself. Besides, most of the ioslave’s functions have a void return type, so feeding arguments to the slave’s functions and checking a return value was not an option either.

This led me to use an approach, which my mentor aptly called “black box testing”.

In this approach, one writes unit tests for a specific action and then checks whether the effects of that action are as expected. In this case, the ioslave was tested by giving it a test file and then applying some of the ioslave’s functions, such as copy, rename, delete, and stat. From there, a bunch of QVERIFY calls check whether the ioslave has completed the operation successfully. Needless to say, this approach is far more convoluted to write unit tests for, as it requires checking each and every test file for its properties in every test case. Fortunately, the QTestLib API is pretty well documented, so it wasn’t difficult to get started with writing unit tests. I also had a template of what a good test suite should look like, thanks to David Faure’s excellent work on implementing automated unit testing for the Trash ioslave. With these two tools in hand, I started off with writing unit tests shortly before the second year of college started.

As expected, writing black-box unit tests was a PITA in its own right. The first time I ran my unit tests, I came up with a dismal score of 6 passed out of the 17 I had written. This led me to go back and check whether my unit tests were testing correctly at all. It turned out that I had made so many mistakes in writing the unit tests that an entire rewrite of the test suite wasn’t unwarranted.

With the rewrite of the test cases completed, I ran the test suite again. The results were a bit better - 13 out of the 17 test cases passed - but 4 failing test cases were still enough reason for the project to be unshippable. Looking into the issue a bit deeper, I found out that all the D-Bus calls to my ioslave for copy and move operations were not working correctly! Given that I had spent so much time on making sure the ioslave was robust, this was quite a surprise. Finally, after a week of rewriting and, to an extent, refactoring the rename and copy functions of the ioslave, I got the best terminal output I ever wanted from this project.

Definitely the highest point of the GSoC for me. From there on out, it was a matter of putting the code on the slow burner for cleaning up any leftover debug statements and for writing documentation for other obscure sections. With a net total of nearly 2000 lines of code, it far surpasses any other project I’ve done in terms of size and quality of code written.

At some points in the project, I felt that the stipend was far too generous, for many people working on KDE voluntarily produce projects much larger than mine. In the end, I feel the best way to repay the generosity is to continue with open source development - just as the GSoC intended. Prior to the GSoC, open source was simply an interesting concept to me, but contributing a couple of thousand lines of code to an open source codebase has made me realise just how powerful open source is. There were no restrictions on when I had to work (usually my productivity was at its peak at outlandish late-night hours), on the machine I used for coding (a trusty IdeaPad, replaced with a much nicer ThinkPad), or on the place where I felt most comfortable coding from (a toss-up between my much-used study table and the living room). In many ways, working from home was probably the best environment I could ask for when it came to working on this project. Hacking on an open source project gave me a sense of gratification that solving a problem in competitive programming never could have.

The Google Summer of Code may be over, but my journey with open source development has just begun. Here’s to even bigger and better projects in the future!

Kdenlive logo
Kdenlive 16.08.0 was released a few weeks ago and we are now preparing the bugfix 16.08.1 release that will be tagged on the 5th of September.
In recent weeks I have not been able to spend much time on Kdenlive. Hopefully, in a few days I will be in Berlin for Akademy, which is always a great opportunity to move forward. For everyone interested in discovering Kdenlive, discussing your successes or problems with it, or getting involved, feel free to join Kdenlive’s BoF session on Monday, 5th of September.

Jean-Baptiste Mardelle

The artists, developers, website maintainers and documentation writers who had been in Deventer for the Krita sprint since Thursday are now slowly returning home. Some will stay a bit longer; others had to leave on Sunday already.


There is something extremely exhausting and exhilarating about a real-life meeting like this! Lots and lots of topics were discussed:

  • We discussed, are actually still discussing, the user interaction design for the vector and text projects this year’s kickstarter funded.
  • Lots of work was done to support OSX properly — the OpenGL patch looks ready to land in Qt! There were also some fixes to tablet handling.
  • We had all three Google Summer of Code students present! Wolthera finished the second part of her project (the first was soft proofing and is in 3.0.1): a new color selector internal to Krita that is fully color managed. Jouni presented his animation work and Julian his work on Qt’s OpenGL QPainter engine.



  • We discussed the publication of a Pepper and Carrot book by the Krita Foundation with David Revoy.
Pepper loves Kiki!


  • We refined the release process and the process by which we take feature requests all the way to implemented and released features.
  • Jouni sat together with Steven, the author of this post’s sketches, who is also an accomplished animator, to go through Krita’s animation workflow.
  • We discussed how badly we need a new architecture and UX design for resource management.
  • We made plans for improving the website and the webshop.
  • Dmitry showed a new brush engine (alcohol markers) that can handle enormous brush diameters — 2500 pixels isn’t impossible.
  • We fixed bugs, bugs and bugs….

Kiki: Angel & Demon

  • And finally, we had a great time. Many dedicated contributors to Krita had never met in person before, and now we’ve got faces and voices mapped to chat channel nicknames and commit message email addresses!

The 2016 Krita sprint was sponsored by KDE e.V. (travel) and the Krita Foundation (accommodation and food). Thanks! We also happened to have planned the sprint for the very week the Dutch summer decided to present us with a heatwave. Fortunately we could use a nice and cool cellar. Add some internet and power strips, and it was a great hack and dinner room!



And we’ve got new builds! With a new splash screen! And some bug fixes.


  • Fixed 100% opacity blobs at the start of a line on OSX
  • Don’t allow users to remove the autogenerated gradients
  • Fix a crash when the resource selector tries to display a deleted resource
  • Update the default workspace set
  • Fix exporting animations to the CSV format
  • Fix translations on Windows


On Windows, Krita supports Wacom, Huion and Yiynova tablets, as well as the Surface Pro series of tablets. The portable zip file builds can be unzipped and run by double-clicking the krita link.

Krita on Windows is tested on Windows 7, Windows 8 and Windows 10. There is only a 64-bit Windows build for now. There is also a debug build which, together with the DrMingw debugger, can help with finding the cause of crashes. See the new FAQ entry. The Windows builds can be slower than usual because vectorization is disabled.


For Linux, we offer AppImages that should run on any reasonably recent Linux distribution. You can download the AppImage, make it executable and run it in place. No installation is needed. At the moment, we only have AppImages for 64-bit versions of Linux. This AppImage has experimental desktop integration.

You can also get Krita from Ubuntu’s App Store in snap format. This version includes the translations for Krita itself. Install with

snap install --beta krita

OSX and MacOS

Krita on OSX will be fully supported with version 3.1. Krita 3.0 for OSX is still missing Instant Preview and High Quality Canvas scaling. There are also some issues with rendering the image — these follow from Apple’s decision to drop support for the OpenGL 3.0 compatibility profile in their display drivers — and an issue with touchpad and tablet support. We are working to reimplement these features using the OpenGL 3.0 Core profile. For now, we recommend disabling OpenGL when using Krita on OSX for production work. Krita for OSX runs on 10.9, 10.10 and 10.11, and is reported to work on macOS too.


The source tarball contains all translations.


In a couple of days Akademy will start, co-hosted with QtCon in Berlin. This will be a super exciting event, bringing together not just KDE folks but also enthusiasts from Qt, FSFE, and VideoLAN.

I’ll be giving a talk on what it’s like to use Qt to compete on the mobile app development market. See you all soon!


It’s finally that time of the year again which many KDE contributors have been looking forward to: The time when we all get together at Akademy, to meet our KDE friends in person, give and listen to talks and make plans for world domination!

This year is special because Akademy will be part of QtCon, a joint conference with Qt, FSFE, VideoLAN and KDAB, which means even greater opportunities to learn something new, reach an audience beyond KDE, and deepen our alliances!

This year, I’ll give three quite different talks:

The first one, on Saturday, is titled “Quo Vadis, KDE? – A FOSS Community’s Journey toward its Vision and Mission”. There I will talk, together with Lydia Pintscher, about how the desire to find a direction for KDE led to the KDE Evolve initiative, which led to the KDE Vision and Mission initiatives, and beyond.

The second one, on Sunday, titled “Meet Kirigami UI – How KDE’s new framework can help to create multi-platform mobile and convergent applications” will be a more product-oriented / technical one. Here, Marco Martin and I present our convergent application framework, Kirigami. I will talk about some design background, whereas Marco will go into technical detail and explain how to set up a project that uses Kirigami.

The third talk I’ll be giving (also Sunday), this time together with Jens Reuterberg, is again more on a “meta level”: Under the title “Movements and Products” we will talk about two different mindsets with which contribution to a Free Software community can be approached: A product-focused mindset or a movement-focused mindset. The two are not mutually exclusive, and in fact we’d recommend adopting some of both for a community like KDE to succeed as a movement that creates products.

If you can’t be at QtCon or can’t make it to the talks: I assume the pages linked above will have recordings to download at some point.

Giving talks is not the only thing I do at Akademy / QtCon, of course. There are also all kinds of BoF sessions to attend: On Monday, I’m planning to be at the Plasma BoF (and especially at the Kirigami-focused part in the afternoon, of course), as well as the “Appstream metadata on software releases” BoF (because I was the one who pushed that topic with Aleix). Tuesday morning will be dedicated to Kube, and where I’ll spend the afternoon mainly depends on whether I’ll be elected into the KDE e.V. board at KDE e.V.’s Annual General Meeting this Thursday. Wednesday morning will be all about Discover.

Had I known there were so many important BoFs for me, I’d probably have stayed longer than Wednesday evening, but that wasn’t clear yet when I booked my travel, so I’ll have to make the most of the first three BoF days.

Aside from all that, there are of course lots of hallway discussions (Akademy is always great for those!) as well as lots of fun to be had!

It will be great as always, I’m really looking forward to the second half of this week and the first half of the next one.

See you in Berlin!

Filed under: KDE

Last year, I wrote about how library authors should pretty darn well never ever make their users spend time on "porting". Porting is always a waste of time. No matter how important the library author thinks his newly fashionable way of doing stuff is, it is never ever as important as the time porting takes away from the application author's real mission: the work on their applications. I care foremost about my users; I expect a library author to care about their users, i.e., people like me.

So, today I was surprised by Goodbye, Q_FOREACH by Marc Mutz. (Well known for his quixotic crusade to de-Qt Qt.)

Well, fuck.

Marc, none, not a single one of all of the reasons you want to deprecate Q_FOREACH is a reason I care even a little bit about. It's going to be deprecated? Well, that's a decision, and a dumb one. It doesn't work on std containers, QVarLengthArray or C arrays? I don't use it on those. It adds 100 bytes of text size? Piffle. It makes it hard to reason about the loop for you? I don't care.

What I do care about is the 1559 places where we use Q_FOREACH in Krita. Porting those will take weeks.

Marc, I hope that you will have a patch ready for us on phabricator soon: you can add it to this project and keep iterating until you've fixed all the bugs.

Happy porting, Marc!

Come into the real world and learn how well this let's-deprecate-and-let-the-poor-shmuck-port-their-code attitude works out.

Q_FOREACH (or the alternative form, foreach) will be deprecated soon, probably in Qt 5.9. Starting with Qt 5.7, you can use the QT_NO_FOREACH define to make sure that your code does not depend on Q_FOREACH.

You may have wondered what all the fuss is about. Why is there a continuous stream of commits going into Qt replacing Q_FOREACH with C++11 ranged for-loops? And why does it take so many commits and several Qt versions to port away from Q_FOREACH? Can’t we just globally search and replace Q_FOREACH (a, b) with for (a : b) and be done with it?

Read on for the answers.

What is Q_FOREACH?

Q_FOREACH is a macro, added in Qt 4, that lets you conveniently iterate over a Qt container:

Q_FOREACH(int i, container)
Q_FOREACH(const QString &s, functionReturningQStringList())

It basically works by copying the second argument into a variable of an internal type called QForeachContainer, and then iterating over that copy. I’m only mentioning this for two reasons: first, you will start seeing that internal QForeachContainer at some point in deprecation warnings (probably starting with Qt 5.9), and, second, yes, you heard correctly: it copies the container.

This copying has two effects: First, since the copy taken is essentially const, no detaching happens when iterating, unlike if you use the C++98 or C++11 alternatives:

for (QStringList::const_iterator it = container.begin(), end = container.end(); it != end; ++it)
for (const auto &s : container)

In both cases the (explicit or implicit) calls to begin() and end() cause a non-const container to detach from shared data, i.e. to perform a deep copy to gain a unique copy of the data.

This problem is well-known and there are tools to detect this situation (e.g. Clazy), so I won’t spend more time discussing it. Suffice to say that Q_FOREACH never causes detaches.

Except when it does.

Q_FOREACH is Convenient^WEvil

The second effect of Q_FOREACH taking a copy of the container is that the loop body can freely modify the original container. Here’s a very, very poor implementation that takes advantage of this:

Q_FOREACH(const QString &lang, languages)
    languages += getSynonymsFor(lang);

Of course, since Q_FOREACH took a copy, once you perform the first loop iteration, languages will detach from that copy in Q_FOREACH, but this kind of code is safe when using Q_FOREACH, unlike when you use C++11 ranged for-loops:

for (const auto &lang : languages)
    languages += getSynonymsFor(lang); // undefined behaviour if
                                       // languages.size() + getSynonymsFor(lang).size() > languages.capacity()

So, as we saw, Q_FOREACH is convenient—if you write code.

Things look a bit different if you try to understand code that uses Q_FOREACH, because you often can’t tell whether the copy that Q_FOREACH unconditionally takes is actually needed in any particular case, or not. A loop that plain falls apart if the container is modified while iterating is much easier to reason about than a Q_FOREACH loop.

And this brings us to porting away from Q_FOREACH.

Towards a Q_FOREACH-Free World

Things would be pretty simple if you could just globally search and replace Q_FOREACH (a, b) with for (a : b) and be done with it. But alas, it ain’t so easy…

We now know that the body of a Q_FOREACH loop is free to modify the container it’s iterating over, and don’t even for a minute think that all cases are so easy to recognize as the example with the languages above. The modification of the container may be several functions deep in the call stack originating from the loop body.

So, the first question you need to ask yourself when porting a Q_FOREACH loop is:

Does the loop body (directly or indirectly) modify the container iterated over?

If the answer is yes, you also need to take a copy and iterate over the copy, but as the nice guy that you are, you will leave a comment telling the future You why that copy is necessary:

const auto containerCopy = container; // doSomethingWith() may modify 'container' if ....
for (const auto &e : containerCopy)

I should note that in cases where the container modification is restricted to appends, you can avoid the copy (and the detach caused by it) by using an indexed loop:

for (auto end = languages.size(), i = 0; i != end; ++i) // important: cache 'languages.size()'
    languages += getSynonymsFor(languages[i]);

Avoiding Detaching

If your container is a std:: container or QVarLengthArray, you are done. Arguably, Q_FOREACH should never, ever have been used on such a container, since copying those always copies all elements (deep copy).

If your container is a const lvalue or a const rvalue, you are done, too. Const objects don’t detach, not even the Qt containers.

If your container is a non-const rvalue, simply store it in an automatic const variable, and iterate over that:

const auto strings = functionReturningQStringList();
for (const QString &s : strings)

Last, but not least, if your container is a non-const lvalue, you have two choices: make the container const, or, if that doesn’t work, use std::as_const() or qAsConst() (new in Qt 5.7, but easily implemented yourself, if required) to cast to const:

for (const QString &s : qAsConst(container))

There, no detaches, no unnecessary copies. Maximum efficiency and maximum readability.


Here’s why you’ll want to port away from Q_FOREACH, ideally to C++11 ranged for-loops:

  • Q_FOREACH is going to be deprecated soon.
  • It only works efficiently on (some) Qt containers; it is prohibitively expensive on all std containers and QVarLengthArray, and doesn’t work at all on C arrays.
  • Even where it works as advertised, it typically costs ~100 bytes of text size more per loop than the C++11 ranged for-loop.
  • Its unconditionally taking a copy of the container makes it hard to reason about the loop.

Happy porting!

The post Goodbye, Q_FOREACH appeared first on KDAB.

Now that Nextcloud 9 is out, many users are already interested in migration so I'd like to address the why and how in this blog post.

Edit: Nextcloud 10 is out with loads of unique features. We now also have a client! You can find out about client account migration here.

Why migrate

Let's start with the why. First, you don't have to migrate yet. This release as well as at least the upcoming releases of own- and Nextcloud will be compatible so you'll be able to migrate between them in the future. We don't want to break compatibility if we can avoid it!

Of course, right now Nextcloud 9 has some extra features and fixes and future releases will introduce other capabilities. With regards to security, we have Lukas Reschke working for us. However, we promise that for the foreseeable future we will continue to report all security issues we find to upstream in advance of any release we do. That means well ahead of our usual public disclosure policy, so security doesn't have to be a reason for people to move.

Edit: Nextcloud 10 comes with far more features on top of this. For Nextcloud 11 we have an ambitious road map already, but we'll still enable migration from ownCloud 9.1 to Nextcloud 11 so you can migrate at your leisure!

Migration overview

If you've decided to migrate there are a number of steps to go through:
  • Make sure you have everything set up properly and do a backup
  • Move the old ownCloud install, preserving data and config
  • Extract Nextcloud, correct permissions and put back data and config
  • Switch data and config
  • Trigger the update via command line or the web UI
Note that we don't offer packages. This has been just too problematic in the past and while we might offer some for enterprise distributions, we hope to work together with distributions to create packages for Nextcloud 9 and newer releases. Once that is done we will of course link to those on our installation page.

There are other great resources besides this blog, especially this awesome post on our forums which gives a great and even more detailed overview of a migration with an Ubuntu/NGINX/PHP7/MariaDB setup.

Edit: With regard to packages, there are now packages for CentOS and Fedora and other distributions will likely follow soon. See our packages repository if you want to help!


First, let's check if you're set up properly. Make sure:
  • You are on ownCloud 8.2.3 or later
  • You have all dependencies installed
  • Your favorite apps are compatible (with ownCloud 9), you can check this by visiting the app store at
  • You made a backup
Once that's all done, time to move to the next step: cleaning out the old files.

Removing old files

In this step, we'll move the existing installation preserving the data and configuration.
  • Put your server in maintenance mode. Go to the folder ownCloud is installed in and execute sudo -u www-data php occ maintenance:mode --on (www-data has to be your HTTP user). You can also edit your config.php file and change 'maintenance' => false, to 'maintenance' => true,.
  • Now move the data and config folder out of the way. Best to go to your webserver folder (something like /var/www/htdocs/) and do a mv owncloud owncloud-backup

Deploying Nextcloud

Now, we will put Nextcloud in place.
  • Grab Nextcloud from our download page or use wget: wget
    • Optional: you can verify that the download went correctly using our MD5 checksum, see this page. Run md5sum on the downloaded file; the output has to match this value: 5ae47c800d1f9889bd5f0075b6dbb3ba
  • Now extract Nextcloud: unzip or tar -xvf nextcloud-9.0.50.tar.bz2
  • Put the config.php file in the right spot: cp owncloud-backup/config/config.php nextcloud/config/config.php
  • Now change the ownership of the files to that of your webserver, for example chown wwwrun:www * -R or chown www-data *
  • If you keep your data/ directory in your owncloud/ directory, copy it to your new nextcloud/ [*]. If you keep it outside of owncloud/ then you don't need to do anything as its location is in config.php.

* Note that if you have been upgrading your server from before ownCloud 6.0 there is a risk that moving the data directory causes issues. It is best to keep the folder with Nextcloud named 'owncloud'. This also avoids having to change all kinds of settings on the server, so it might be a wise choice in any case: rename the nextcloud folder to owncloud.

Now upgrade!

Next up is restarting the webserver and upgrading.
  • Restart your webserver. How depends on your distribution, for example rcapache2 restart on openSUSE or service apache2 restart on Ubuntu.
  • You can now trigger the update either via OCC or via web. Command line is the most reliable solution. Run it as sudo -u apache php occ upgrade from the nextcloud folder. This has to run as the user of your webserver and thus can also be www-data or www for example.
  • Then, finally, turn off maintenance mode: sudo -u www-data php occ maintenance:mode --off
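For reference, the file-shuffling steps above can be condensed into a script. This is only a sketch, run against throwaway directories so it is safe to execute anywhere; on a real server the paths, the tarball name, and the HTTP user are yours to adapt, and the chown/occ steps (shown as comments) need root:

```sh
#!/bin/sh
set -e

# Sketch of the migration file shuffle, demonstrated on throwaway
# directories. On a real server you would work in your webserver
# root (e.g. /var/www/htdocs) instead of ./demo.
mkdir -p demo/owncloud/config demo/owncloud/data
echo "<?php \$CONFIG = array('maintenance' => true);" > demo/owncloud/config/config.php

# 1. Move the old install out of the way, preserving data and config.
mv demo/owncloud demo/owncloud-backup

# 2. "Extract" Nextcloud (on a real server: tar -xvf nextcloud-9.0.50.tar.bz2).
mkdir -p demo/nextcloud/config

# 3. Put config and data back in place.
cp demo/owncloud-backup/config/config.php demo/nextcloud/config/config.php
cp -r demo/owncloud-backup/data demo/nextcloud/data

# 4. Fix ownership for your HTTP user (needs root; user varies by distro):
#    chown -R www-data:www-data demo/nextcloud

# 5. On the real server you would now restart the webserver and run:
#    sudo -u www-data php occ upgrade
#    sudo -u www-data php occ maintenance:mode --off
test -f demo/nextcloud/config/config.php && echo "config in place"
test -d demo/nextcloud/data && echo "data in place"
```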

That's it!

At this point, you'll see the fresh blue of a Nextcloud server! If you encounter any issues with upgrading, discuss them on our forums.

This year I’m giving a somewhat technical talk called “Bring NetworkManager support to your Qt applications”, where I would like to share with you the possibilities of the NetworkManagerQt framework. I’m also hosting a BoF on Monday about Flatpak, where I plan to discuss mostly KDE Flatpak portals, so anyone interested in this topic is welcome. Aaaand to be honest, I wrote this blog post just to be able to use the lovely “I’m going to Akademy” banner. See you in Berlin!!


Could you tell us something about yourself?

I’m a self-taught Sunday digital painter from Tokyo, Japan. I publish my artworks under my alias Omiya Tou. Sometimes I’m also a FLOSS tester or translator and lately I’ve committed translations for G’MIC.

Do you paint professionally, as a hobby artist, or both?

Currently I paint completely as a hobby for fun and stress-relief.

What genre(s) do you work in?

I usually do 2D drawings/paintings of manga-styled portraits.

Whose work inspires you most — who are your role models as an artist?

When I create my artwork I always try to convey the tranquility and warmth of Nihon-ga works such as those by Kaii. Also I have been very much impressed by Yoko Tanji’s versatility in styles.

How and when did you get to try digital painting for the first time?

In 2002, when I was 14, I did what can be called digital painting for the first time. I think it was a Final Fantasy fanart. For the first few years I created my artworks by digitizing pencil linearts with a scanner shared at my school, taking the data home on floppy discs and coloring them with a mouse and a freeware painting app named Pixia on a Windows XP laptop.

What makes you choose digital over traditional painting?

It would be that digital media are less time- and space-consuming, easier to revise and enable me to focus on the pure joy of painting, leaving every boring task to the computer.


How did you find out about Krita?

I did when I was trying out painting apps available in Ubuntu to which I migrated from Windows to run GIMP at its full speed.

What was your first impression?

When I tried Krita for the first time it was still a sort of tech demo and unsuitable for daily use, but I felt it was worth keeping track of — and actually I’ve been excited seeing how much it has improved since then.

What do you love about Krita?

There are quite a lot of things, but I especially love the quick pop-up palette, numerous drawing assistants, well-tuned preset brushes, and the hyper-energetic dev team. 🙂

What do you think needs improvement in Krita? Is there anything that really annoys you?

For me Krita is just awesome lately, but if I must say something, I think it would be nice if the color curves dialog window were resizable and the measure tool remained visible when another tool is active.

What sets Krita apart from the other tools that you use?

Canvas tilting/flipping, GPU acceleration and CMYK capability.

If you had to pick one favourite of all your work done in Krita so far, what would it be, and why?

I’d pick this because I like the sense of depth and atmosphere:


What techniques and brushes did you use in it?

I colored this by overlaying a solid blue-gray layer which is set to color dodge mode onto a grayscale image. A lot of this is painted using a brush named Bristles_hairy, definitely one of my favorites.

Where can people see more of your work?

I cross-post my works onto deviantART, Tumblr and Flickr.

Anything else you’d like to share?

On my Flickr page I post my artworks under a CC-BY license in their full resolution, so please feel free to use them. Thank you!

One of the most interesting killer projects of the KDE Community keeps moving forward: KDE Connect 1.0 has been released, the must-have application for your KDE desktop and your smartphone that simplifies and optimizes the interaction between them. It's time to update and see what it has to offer.

KDE Connect 1.0 released

One of the applications you should have on your phone and on your computer if you want to use them efficiently has just published its first definitive version, that is, 1.0 (although, as we know, that doesn't mean the earlier versions weren't usable).

On August 26, Albert Vaca, creator of KDE Connect, published a post on his blog announcing the release of the new version of his wonderful software: KDE Connect. This new version comes with many important new features, which the magnificent Víctorhck has translated from the official site on his blog, and which I simply copy and paste here, so as not to duplicate work and to share knowledge:

  • Preconfigured commands, in the style of "launchers" or shortcuts, which make the most common tasks quicker and more productive.
  • You can reply to SMS directly from your desktop. One of the most awaited features: when you receive a text-message notification on your desktop, a "Reply" button lets you answer the message without touching your phone. Magic? No, just free software! By the way, to enjoy this you'll need version 1.4 of the Android app, already available.
  • Receive your desktop's notifications on your phone. This feature is disabled by default, since it can become a bit annoying. You'll need to enable it both in the Android app and in System Settings, and you can choose which notifications are forwarded to the phone and which are not.
  • KDE Connect gains more robust encryption, switching from RSA to TLS. Besides protecting the communication between both devices, this also improves speed and reduces battery consumption.

Remember that to use this application you must have it installed both on your computer (I already explained how to do it for openSUSE Leap 42.1; other distributions ship it by default) and on your smartphone, downloading the app from F-Droid or the Play Store.

As we can see, many new features in one of the most famous projects of the KDE Community, partly the fruit of the work done at the Randa Sprints.

More information: Albert Vaca’s blog | Víctorhck in the Free World | La mirada del replicante

August 28, 2016

If you want to know what we did in KDE PIM in the last year and what we are planning to do and achieve in the next one come to my KDE PIM Status Report talk on Sunday at 1 PM. If you want to get into more technical details and discussions about KDE PIM there will also be a KDE PIM BoF session on Monday afternoon.

I'm going to Akademy 2016! See you in Berlin!


Weee akademy

QtCon starts on September 1st and I will be traveling to Berlin this Tuesday. This is the second Akademy I am attending, after Akademy 2014 in the Czech Republic. During QtCon I am giving a presentation titled Plasma Mobile: what we achieved in a year in room A04 on September 2nd. I am also going to moderate the Student Presentations, where students taking part in KDE's various mentoring programs, like GSoC, GCI, SoK, or OPW, present their work.

I am also taking part in various BoFs starting from September 5th, like the Plasma BoF, KDE SoC, and the KDE Neon BoF.

Looking forward to productive QtCon! See you in Berlin! :-)

August 27, 2016

I’ve given my website a facelift. The new version is more mobile-friendly and modern-looking, and quite a departure visually from its original look. I’ve gone for a newspaper-like, typography-based responsive layout. My site finally also supports SSL, thanks to Let’s Encrypt.

Going to Akademy
Next week, I’ll be going to Akademy, which is co-hosted with QtCon. As usual, my focus will be around Plasma-related topics. I’ll also hold a presentation about a KDE software store.

In less than a week QtCon will start and Randa Meetings will be part of it. On Saturday at 16:30 you’ll get an interesting story about 7 years of Randa Meetings. Mario Fux will talk about his experience organizing the KDE Tech Summit in the middle of nowhere in the Swiss Alps.

Besides Mario you will have the opportunity to meet and talk with the president of the Randa Meetings association: Simon Wächter. Don’t miss this chance and get a second opinion and maybe some insights.

And now on a different subject, which is something we might have as one of our main topics next year: Accessibility and Personal Information Management (PIM). Accessibility doesn’t just mean to make our software accessible for people with various disabilities, but also to make our software accessible on different operating systems, different devices and with different user interfaces (graphical with keyboard and mouse or touch or speech and other senses). Regarding PIM I don’t think you need more information about what this is: e-mail, contacts, events and all this synchronized and stored in a safe way.

In case you think it’d make sense to bring another group or topic to Randa as well, don’t hesitate to contact us: #randa on IRC, mail, QtCon or via snail mail (see for details).

So if you can contribute something to these topics, work on Free Software in these areas and want to come to Randa next year: please join the date selection Doodle for 2017.

I'm going to Akademy

Flattr this!


I’ll be at QtCon and Akademy in Berlin, mostly as observer, I guess 🙂

I’ll have also two workshops/discussions on Tuesday 6th (see [1]).

Music player

For some time now, there were threads [2] about designing a new music player. The VDG people came up with a vision and some first design ideas [3] and I built a first specification page on the community wiki [4].

I’ll do a workshop which will deal about:

  • what will make this player not just another player
  • feature discussion
  • architecture design
  • use of the public libraries of plasma-mediacenter
  • …what you want to add…

The first 3 points will be mostly a presentation of what I want to do, with discussions about how it can be done better.

I already wrote some code, but it was mostly experimentation; the project was put on hold until this Akademy session, in order to start on the right footing.

Documentation and KApiDox

I also reserved a slot for KApiDox, the program that generates the API documentation website. The codebase has changed a lot lately, and more and more projects are being generated. However, it’s still not very robust against errors (the whole process can break instead of just skipping the error source), and it turned out not to cover every use case.

If you think our API documentation is important and can/should be enhanced, please join the discussion so that I can know your needs (as API user or API writer) and enhance the whole thing in the future.



August 26, 2016

After some time of activity on KBibTeX's master branch, I finally returned to the stable branches to push forwards some releases.


Colorpick is one of my little side-projects. It is a tool to select colors. It comes with a screen color picker and the ability to check two colors contrast well enough to be used as foreground and background colors of a text.

Contrast check

Three instances of Colorpick showing how the background color can be adjusted to reach a readable text.

The color picker

The color picker in action. The cursor can be moved using either the mouse or the arrow keys.

I wrote this tool a few years ago, using Python 2, PyQt 4 and PyKDE 4. It was time for an update. I started by porting it to Python 3, only to find out that apparently there are no Python bindings for KDE Frameworks...

Colorpick uses a few kdelibs widgets, and some color utilities. I could probably have rewritten those in PyQt 5, but I was looking for a pretext to have a C++-based side-project again, so instead I rewrote it in C++, using Qt 5 and a couple of KF5 libraries. The code base is small and PyQt code is often very similar to C++ Qt code, so it only took a few 45-minute train commutes to get it ported.

If you are a Colorpick user and were sad to see it still using Qt 4, or if you are looking for a color picker, give it a try!

Older blog entries

Planet KDE is made from the blogs of KDE's contributors. The opinions it contains are those of the contributor. This site is powered by Rawdog and Rawdog RSS. Feed readers can read Planet KDE with RSS, FOAF or OPML.