August 30, 2016





I'm coming back home on Tuesday afternoon-ish so if you want to catch me for something be sure to make it before then :)

See you in Berlin!

Wow! What a summer!

Roughly 12 weeks have passed since I started working on KDE Now (well, formally). Results came out a couple of hours ago and I passed the full-term evaluation. In this post, I'm going to talk about what I'm planning for the future of KDE Now, how the past weeks have been, and some other things.

 

To start with, the summer was awesome. I got to learn so much, and that certainly made me a better programmer. Before this summer, I had worked on some parts of large codebases but never really started one from scratch. KDE Now taught me a lot about all the things you have to face when starting a project: the design choices you have to make, even down to the bottommost nitty-gritty details like const correctness or avoiding copies of data in function calls. All these tiny bits eventually contribute to a better, working, cohesive whole.

 

As for my work, you can find the formal work report here. KDE Now now has a quickgit repository. It's currently lying in playground and will continue to do so for a couple of months until I have everything sorted out before a possible release. First off, I'm gonna work on documentation, to attract other developers to contribute to it. Who wants to work on undocumented software anyway? 😛 Next up, I will write unit tests. I have tested the application, but only as a whole in a few different scenarios; unit testing would be a lot better. But before all that can happen, I want a break. Haha! I've been working on this for so long that other things took a lower priority. School started a while ago and a lot of things require my attention. So I think I should take a month off. You'll hear from me again with more updates from the start of October. Meanwhile, if you have any doubts/questions or want to contribute to KDE Now, I'll be glad to answer everything. Just comment below or drop me a mail 🙂

 

I am very thankful to my mentor, Ashish Bansal. He is simply an awesome mentor. He would point me in some direction and leave the rest to me to find out on my own. He would tell me which design choices were better and their trade-offs, but then again, left it to me to choose on my own. We talked every day or two. There were some times when I wasn't very productive, but he was pretty cool with that too, as long as I was not terribly behind schedule. I even poked him during his vacation, and he still managed to reply! I'm really thankful to him. I couldn't have asked for a better mentor. I am also thankful to KDE for giving me this opportunity and finally to Google for organizing this awesome program.

 

Cheers!

See you later.


Hello,

Last week we updated the Qt Forum to the latest version of NodeBB.

We had been planning the upgrade for a while, but had to do it on short notice, as a bug that leaked user emails was found in the forum.

This means that it was possible for someone to find out user emails from the forum. For those users who have their email set as public, this is not an issue, but some of you want to keep your email to yourself. The bug meant that these email addresses could also be found.

No other data was available through the bug, and as we are using a central sign-in service, no account information could leak from the forum.

So if you have been getting more email spam than normal, this might be one cause.

We are sorry for the leak, but in our defence, we did not know about it, and we patched the system within a day of becoming aware of the issue.

But on to the upgrade itself.

With the upgrade we changed to the new default theme used by NodeBB. It looks quite different from the old theme, and has already received feedback both for and against. I personally am getting used to the look and feel, and after the initial shock, I like it. That's a personal opinion, your mileage may vary, so please do tell us in the comments.

Due to the rushed upgrade some small things still need tweaking; the colours are a bit off from the Qt green, which will be fixed as soon as I find the time for it.

The reasoning for updating the theme is that we can now follow NodeBB upgrades faster, as we do not need to customise the theme as much as before. This will bring improvements to you faster.

As an example of a new feature, we now have chat rooms instead of only one-to-one chats on the forum. To create a room, start a chat and then add other users from the chat window settings. At least for the forum regulars this is quite an improvement.

So what do you think of the new Qt Forum look? Please tell us in the comments or drop by the forum to share your opinion.

The post New Forum theme and security notice appeared first on Qt Blog.

Last week, I spent 4 days at the Krita Sprint in Deventer, where several contributors gathered to discuss the current hot topics, draw and hack together.

You can read a global report of the event on krita.org news.

On my side, besides meeting old and new friends, and discussing animation, brushes and vector stuff, I made three commits:
- replace some duplicate icons with aliases in qrc files
- update the default workspaces
- add a new “Eraser Switch Opacity” feature (this one is on a separate branch for now)

I also filed new tasks on Phabricator for two feature requests to improve the color and animation workflows:

https://phabricator.kde.org/T3542

https://phabricator.kde.org/T3543

Once again, I feel it's been a great and productive meeting for everyone. A lot of cool things are ready for the next Krita version, which is exciting! Many thanks to KDE e.V. for the travel support, and to the Krita Foundation for hosting the event and providing accommodation and food.

On Friday QtCon starts and there will of course be an update about the current state of Wayland support in Plasma. See you during the lightning talk session on Friday between 17:30 and 18:30 for my lightning talk “We are in Wayland!”

This news is a bit old, which is becoming a habit on this blog during an end of August unusually packed with fresh news. So today it falls to me to announce something I suppose everyone already knows: in the middle of the month a bombshell shook the Community, when it was announced that Canonical would become a new sponsor of KDE. And, to round off the article, I will share my personal thoughts with you, from the point of view of a user who does not know all the ins and outs of the Free Software world.

Canonical, a new sponsor of KDE e.V.

For those who live in a cave or have been on vacation, this may come as a surprise: Canonical became a sponsor of KDE in mid-August. The announcement surprised insiders and outsiders alike, given Canonical's uneven relationship with the KDE project.

Let's not forget that Kubuntu has always been an odd distribution within the xUbuntu family, that it went from being "official" to belonging entirely to the Community a few years ago, and that recently part of its team left the project and created KDE Neon.

With all this history, the Canonical-KDE relationship did not seem to be going through its best moment, but life is full of surprises, and on August 18 it was announced on KDE.News, KDE's official communication channel, that Canonical was becoming a new sponsor of the project.

In practical terms, this means that Canonical will contribute €10,000 per year to KDE to promote the development of Free Software, in exchange for appearing as a KDE sponsor on the website (and other media) and for promoting the adoption of "Snap packages in KDE workspaces and applications so that they are easily installable by users of any Linux distribution", in the words of Michael Hall, Ubuntu's Community Manager.

For its part, KDE e.V. views the news very positively. In the words of Aleix Pol, vice president of the foundation, "it is important to keep making joint efforts, as it is the best way to continue offering a free and open development platform", and he trusts that the collaboration will be beneficial for the whole community and the GNU/Linux ecosystem.

 

More information: KDE.News | Genbeta

My personal thoughts

Some people have thrown up their hands, thinking that when a company sponsors a Free Software project, the project loses its independence; but in reality this should not surprise us. In my opinion, I don't think that will happen. After all, KDE is already sponsored by Blue Systems, Google, SUSE and The Qt Company, so Canonical is simply joining the KDE project in a visible way, and to date, as far as I know, the project has not been swayed by corporate decisions.

From my point of view, as long as the people leading the projects are clear that their work is by and for the Community, there should be no problem. That said, as lovers of the project know, for it to keep growing it needs funding: physical infrastructure such as servers has to be maintained, human relationships have to be fostered through events, all the paperwork has to be kept in order, materials are needed to be present at other events and, if possible, people should be hired full-time to work on the project.

kde_community

All this costs money, and it has to come from somewhere. As I see it, there are currently only two viable ways of raising funds: donations from the Community (such as the campaigns for the Randa Sprints or the Krita fundraisers) and corporate sponsorship. To my mind, two more exist: advertising, which is not desirable at all, and support from public institutions (governments, town councils, universities, etc.), a route that in my opinion should be exploited more and that we are perhaps wasting.

Obviously, one source of funding does not exclude the others, so if there are companies that want to sponsor KDE and they have known ties to Free Software, I see no problem with it. Software companies wanting to attach their name to KDE is nothing bad; it is simply a sign that the KDE project has prestige, exudes quality and has a great future.

That said, it would not be good for this to become the only route either. Community projects are a reflection of their Community, so the Community's participation is extraordinarily important. The members of the KDE Community must understand that being part of a Community does not mean receiving everything without giving anything back, and the "payment" asked of them is to participate in every aspect of the Project they can: contributing to donations, writing code, promoting the project, and so on.

20 ways to contribute to KDE

In short, I think it is fine for companies to join as sponsors as long as they do not dictate the project's development direction. However, I also believe that to keep that from happening, KDE must have a Community involved in the project that is active, participative, dynamic and enterprising.

 

So, yeah. I failed the final evaluations of Google Summer of Code. I didn't expect it, but I think I should have added more features to my project and worked harder. To be honest, this was one hell of a summer. Working on such a project, OfflineExtension for WikiEditor, with such an awesome community, WikiToLearn, and such cool mentors, Gianluca Rigoletti and Irene Cortinovis, is worth millions. I couldn't pass the final evaluations, but I gained so much that I have no hard feelings. Of course, I felt bad, but in the end, the experience is all that matters. Thanks, Google and WikiToLearn, for giving me such an amazing opportunity. I will keep working on my project and try to get it into production as soon as possible. Also, I will reapply for Google Summer of Code next year with the same organization, and will definitely pass :'). And the bad part ends here. The good news is that I'm now a core developer of WikiToLearn, and I find this opportunity to be more worthy than being a GSoCer, so this.failure eventually equals this.success (sorry for this, I'm learning Java :P ). Thanks, Riccardo and Gianluca, for giving me this opportunity. I'll give my best, work harder, learn, and contribute as much as I can. :)

Long Live WikiToLearn, Long Live KDE, Long Live Mediawiki and Long Live FOSSatAmrita! :)

August 29, 2016

That’s it. After a combined total of 217 git commits, 6,202 lines of code added, and 4,167 lines of code deleted, GSoC 2016 is finally over.


These twelve weeks of programming have been a very enriching experience for me, and making this project has taught me a lot about production-quality software development. Little did I know that a small project I had put together in a 6-hour session of messing around with Qt would lead to something as big as this!

There have been many memorable moments throughout my coding period for the GSoC - from the first time I got an ioslave to install correctly, to writing its “Hello World” equivalent, to getting a basic implementation of the project up and running through a series of dirty hacks in Dolphin’s code. There were also times when I was so frustrated with debugging for this project that I wanted to do nothing but smash my laptop’s progressively failing display panel with the largest hammer I could find. The great mentorship from my GSoC mentor and the premise of the GSoC program itself kept me going. This also taught me an important lesson with regard to software development - no one starting out gets it right on their first try. It feels like after a long run of not quite getting the results I wanted, the GSoC is the thing which worked out for me, as everything just fell into place.

There’s a technical digression here, which you can feel free to skip through if you don’t want to get into the details of the project.

Following up from the previous blog post, with the core features of the application complete, I had moved on to unit testing my project. For this project, unit testing involved writing test cases for each and every component of the application to find and fix any bugs. Despite the innocuous name, unit testing this project was a much bigger challenge than I expected. For one thing, the ioslave in my project is merely the controller in the MVC system formed by the virtual Stash File System, the Dolphin file manager, and the KIO slave itself. Besides, most of the ioslave’s functions have a void return type, so feeding arguments to the slave’s functions and checking their output was not an option either.

This led me to use an approach which my mentor aptly called “black box testing”.

In this approach, one writes unit tests that perform a specific action and then check whether the effects of that action are as expected. In this case, the ioslave was tested by giving it a test file and then applying some of the ioslave’s functions, such as copy, rename, delete, and stat. From there, a bunch of QVERIFY calls check whether the ioslave has completed the operation successfully. Needless to say, this approach is far more convoluted to write unit tests for, as it requires checking each and every test file for its properties in every test case. Fortunately, the QTestLib API is pretty well documented, so it wasn’t difficult to get started with writing unit tests. I also had a template of what a good test suite should look like thanks to David Faure’s excellent work on implementing automated unit testing for the Trash ioslave. With these two tools in hand, I started off with writing unit tests shortly before the second year of college started.
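
To make the idea concrete, here is a minimal sketch of what such a black-box test could look like with QTestLib and KIO. This is not the actual test suite for the Stash ioslave; the stash:/ URL scheme, file names and class name below are assumptions used purely for illustration.

// Black-box style: perform an action through the slave, then verify its effects.
// Hypothetical example; the stash:/ scheme and all names are illustrative only.
#include <QtTest>
#include <QTemporaryDir>
#include <QFile>
#include <QUrl>
#include <KIO/FileCopyJob>

class BlackBoxTest : public QObject
{
    Q_OBJECT
private Q_SLOTS:
    void copyRoundTrip()
    {
        QTemporaryDir dir;
        QVERIFY(dir.isValid());

        // Prepare a known test file on disk.
        const QString srcPath = dir.path() + QStringLiteral("/input.txt");
        QFile src(srcPath);
        QVERIFY(src.open(QIODevice::WriteOnly));
        src.write("hello ioslave");
        src.close();

        // Action: push the file through the slave...
        const QUrl remote(QStringLiteral("stash:/input.txt")); // hypothetical scheme
        KIO::FileCopyJob *put = KIO::file_copy(QUrl::fromLocalFile(srcPath), remote, -1, KIO::Overwrite);
        QVERIFY(put->exec());

        // ...and pull it back out again.
        const QString outPath = dir.path() + QStringLiteral("/output.txt");
        KIO::FileCopyJob *get = KIO::file_copy(remote, QUrl::fromLocalFile(outPath), -1, KIO::Overwrite);
        QVERIFY(get->exec());

        // Effect check: the round-tripped file must exist and match the original.
        QFile out(outPath);
        QVERIFY(out.open(QIODevice::ReadOnly));
        QCOMPARE(out.readAll(), QByteArray("hello ioslave"));
    }
};

QTEST_GUILESS_MAIN(BlackBoxTest)
#include "blackboxtest.moc"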

As expected, writing black box unit tests was a PITA in its own right. The first time I ran my unit tests, I came up with a dismal score of 6 passed out of the 17 I had written. This led me to go back and check whether my unit tests were testing correctly at all. It turned out that I had made so many mistakes in writing the unit tests that an entire rewrite of the test suite wasn’t unwarranted.

With a rewrite of the test cases completed, I ran the test suite again. The results were a bit better - 13 out of the 17 test cases passed, and the 4 failing ones were enough reason for the project to be unshippable. Looking into the issue a bit deeper, I found out that all the D-Bus calls to my ioslave for copy and move operations were not working correctly! Given that I had spent so much time on making sure the ioslave was robust, this was a mixed surprise. Finally, after a week of rewriting and, to an extent, refactoring the rename and copy functions of the ioslave, I got the best terminal output I ever wanted from this project.


Definitely the highest point of the GSoC for me. From there on out, it was a matter of putting the code on the slow burner, cleaning up any leftover debug statements and writing documentation for the more obscure sections. With a net total of nearly 2000 lines of code, it far surpasses any other project I’ve done in terms of size and quality of code written.

At some points in the project, I felt that the stipend was far too generous, for many people working on KDE voluntarily produce projects much larger than mine. In the end, I feel the best way to repay the generosity is to continue with open source development - just as the GSoC intended. Prior to the GSoC, open source was simply an interesting concept to me, but contributing a couple of thousand lines of code to an open source codebase has made me realise just how powerful open source is. There were no restrictions on when I had to work (usually my productivity was at its peak at outlandish late-night hours), on the machine I used for coding (a trusty IdeaPad, replaced with a much nicer ThinkPad), or on the place where I felt most comfortable coding from (a toss-up between my much-used study table and the living room). In many ways, working from home was probably the best environment I could ask for when it came to working on this project. Hacking on an open source project gave me a sense of gratification that solving a problem in competitive programming never could.

The Google Summer of Code may be over, but my journey with open source development has just begun. Here’s to even bigger and better projects in the future!

Kdenlive logo
Kdenlive 16.08.0 was released a few weeks ago and we are now preparing the 16.08.1 bugfix release, which will be tagged on the 5th of September.
In recent weeks I have not been able to spend much time on Kdenlive. Hopefully, in a few days I will be in Berlin for Akademy, which is always a great opportunity to move forward. For everyone interested in discovering Kdenlive, discussing your successes or problems with it, or getting involved, feel free to join the Kdenlive BoF session on Monday, the 5th of September.

Jean-Baptiste Mardelle


The artists, developers, website maintainers and documentation writers who had been in Deventer for the Krita sprint since Thursday are now slowly returning home. Some will stay for a bit longer, others had to leave on Sunday already.

DSCF5848

There is something extremely exhausting and exhilarating about a real-live meeting like this! Lots and lots of topics were discussed:

  • We discussed (and are actually still discussing) the user interaction design for the vector and text projects funded by this year’s Kickstarter.
  • Lots of work was done to support OSX properly — the OpenGL patch looks ready to land in Qt! There were also some fixes to tablet handling.
  • We had all three Google Summer of Code students present! Wolthera finished the second part of her project (the first was soft proofing and is in 3.0.1): a new color selector internal to Krita that is fully color managed. Jouni presented his animation work and Julian his work on Qt’s OpenGL QPainter engine.

index

 

  • We discussed the publication of a Pepper and Carrot book by the Krita Foundation with David Revoy.
Pepper loves Kiki!

Pepper loves Kiki!

  • We refined the release process and the process by which we take feature requests all the way to implemented and released features.
  • Jouni sat down with Steven, the author of this post’s sketches and an accomplished animator, to go through Krita’s animation workflow.
  • We discussed how badly we need a new architecture and UX design for resource management.
  • We made plans for improving the website and the webshop.
  • Dmitry showed a new brush engine (alcohol markers) that can handle enormous brush diameters — 2500 pixels isn’t impossible.
  • We fixed bugs, bugs and more bugs…

Kiki_Angel &Demon

  • And finally, we had a great time. Many dedicated contributors to Krita had never met in person before, and now we’ve got faces and voices mapped to chat channel nicknames and commit message email addresses!

The 2016 Krita sprint was sponsored by KDE e.V. (travel) and the Krita Foundation (accommodation and food). Thanks! We also happened to have planned the sprint for exactly the week the Dutch summer decided to present us with a heatwave. Fortunately we could use a nice and cool cellar. Add some internet and power strips, and it was a great hack and dinner room!

DSCF5836

Builds

And we’ve got new builds! With a new splash screen! And some bug fixes.

electrichearts_20160517_20160820_kiki_02

  • Fixed 100% opacity blobs at the start of a line on OSX
  • Don’t allow users to remove the autogenerated gradients
  • Fix a crash when the resource selector tries to display a deleted resource
  • Update the default workspace set
  • Fix exporting animations to the CSV format
  • Fix translations on Windows

Windows

On Windows, Krita supports Wacom, Huion and Yiynova tablets, as well as the Surface Pro series of tablets. The portable zip file builds can be unzipped and run by double-clicking the krita link.

Krita on Windows is tested on Windows 7, Windows 8 and Windows 10. There is only a 64-bit Windows build for now. There is also a debug build that, together with the DrMingw debugger, can help with finding the cause of crashes. See the new FAQ entry. The Windows builds can be slower than usual because vectorization is disabled.

Linux

For Linux, we offer AppImages that should run on any reasonably recent Linux distribution. You can download the AppImage, make it executable and run it in place. No installation is needed. At the moment, we only have AppImages for 64-bit versions of Linux. This AppImage has experimental desktop integration.

You can also get Krita from Ubuntu’s App Store in snap format. This version includes the translations for Krita itself. Install with

snap install --beta krita

OSX and MacOS

Krita on OSX will be fully supported with version 3.1. Krita 3.0 for OSX is still missing Instant Preview and High Quality Canvas scaling. There are also some issues with rendering the image (these follow from Apple’s decision to drop support for the OpenGL 3.0 compatibility profile in their display drivers) and issues with touchpad and tablet support. We are working to reimplement these features using the OpenGL 3.0 Core profile. For now, we recommend disabling OpenGL when using Krita on OSX for production work. Krita for OSX runs on 10.9, 10.10 and 10.11, and is reported to work on macOS too.

Source

The source tarball contains all translations.

 

In a couple of days Akademy will start; it’s co-hosted with QtCon in Berlin. This will be a super exciting event bringing together not just KDE folks but also enthusiasts from Qt, FSFE, and VideoLAN.

I’ll be giving a talk on what it’s like to use Qt to compete on the mobile app development market. See you all soon!

going-to-akademy-2016

It’s finally that time of the year again which many KDE contributors have been looking forward to: The time when we all get together at Akademy, to meet our KDE friends in person, give and listen to talks and make plans for world domination!

This year is special because Akademy will be part of QtCon, a joint conference with Qt, FSFE, VideoLAN and KDAB, which means even greater opportunities to learn something new, reach an audience beyond KDE, and deepen our alliances!

This year, I’ll give three quite different talks:

The first one, on Saturday, is titled “Quo Vadis, KDE? – A FOSS Community’s Journey toward its Vision and Mission“. There I will talk, together with Lydia Pintscher, about how the desire to find a direction for KDE led to the KDE Evolve initiative, which led to the KDE Vision and Mission initiatives, and beyond.

The second one, on Sunday, titled “Meet Kirigami UI – How KDE’s new framework can help to create multi-platform mobile and convergent applications” will be a more product-oriented / technical one. Here, Marco Martin and I present our convergent application framework, Kirigami. I will talk about some design background, whereas Marco will go into technical detail and explain how to set up a project that uses Kirigami.

The third talk I’ll be giving (also Sunday), this time together with Jens Reuterberg, is again more on a “meta level”: Under the title “Movements and Products” we will talk about two different mindsets with which contribution to a Free Software community can be approached: A product-focused mindset or a movement-focused mindset. The two are not mutually exclusive, and in fact we’d recommend adopting some of both for a community like KDE to succeed as a movement that creates products.

If you can’t be at QtCon or can’t make it to the talks: I assume the pages linked above will have recordings to download at some point.

Giving talks is not the only thing I do at Akademy / QtCon, of course. There are also all kinds of BoF sessions to attend: On Monday, I’m planning to be at the Plasma BoF (and especially at the Kirigami-focused part in the afternoon, of course), as well as the “Appstream metadata on software releases” BoF (because I was the one who pushed that topic with Aleix). Tuesday morning will be dedicated to Kube, and where I’ll spend the afternoon mainly depends on whether I’ll be elected into the KDE e.V. board at KDE e.V.’s Annual General Meeting this Thursday. Wednesday morning will be all about Discover.

Had I known there were so many important BoFs for me, I’d probably have stayed longer than Wednesday evening, but that wasn’t clear yet when I booked my travel, so I’ll have to make the most of the first three BoF days.

Aside from all that, there are of course lots of hallway discussions (Akademy is always great for those!) as well as lots of fun to be had!

It will be great as always, I’m really looking forward to the second half of this week and the first half of the next one.

See you in Berlin!


Filed under: KDE

Last year, I wrote about how library authors should pretty darn well never ever make their users spend time on "porting". Porting is always a waste of time. No matter how important the library author thinks their newly fashionable way of doing stuff is, it is never as important as the time porting takes away from the application authors' real mission: the work on their applications. I care foremost about my users; I expect a library author to care about their users, i.e., people like me.

So, today I was surprised by Goodbye, Q_FOREACH by Marc Mutz. (Well known for his quixotic crusade to de-Qt Qt.)

Well, fuck.

Marc, none, not a single one of all of the reasons you want to deprecate Q_FOREACH is a reason I care even a little bit about. It's going to be deprecated? Well, that's a decision, and a dumb one. It doesn't work on std containers, QVarLengthArray or C arrays? I don't use it on those. It adds 100 bytes of text size? Piffle. It makes it hard to reason about the loop for you? I don't care.

What I do care about is the 1559 places where we use Q_FOREACH in Krita. Porting this will take weeks.

Marc, I hope that you will have a patch ready for us on phabricator soon: you can add it to this project and keep iterating until you've fixed all the bugs.

Happy porting, Marc!

Come into the real world and learn how well this let's-deprecate-and-let-the-poor-shmuck-port-their-code attitude works out.

Q_FOREACH (or the alternative form, foreach) will be deprecated soon, probably in Qt 5.9. Starting with Qt 5.7, you can use the QT_NO_FOREACH define to make sure that your code does not depend on Q_FOREACH.

You may have wondered what all the fuss is about. Why is there a continuous stream of commits going into Qt replacing Q_FOREACH with C++11 ranged for-loops? And why does it take so many commits and several Qt versions to port away from Q_FOREACH? Can’t we just globally search and replace Q_FOREACH (a, b) with for (a : b) and be done with it?

Read on for the answers.

What is Q_FOREACH?

Q_FOREACH is a macro, added in Qt 4, that allows you to conveniently iterate over a Qt container:

Q_FOREACH(int i, container)
    doSomethingWith(i);
Q_FOREACH(const QString &s, functionReturningQStringList())
    doSomethingWith(s);

It basically works by copying the second argument into a variable of an internal type called QForeachContainer, and then iterating over it. I’m only mentioning this for two reasons: First, you will start seeing that internal QForeachContainer at some point in deprecation warnings (probably starting with Qt 5.9), and, second, yes, you heard correctly, it copies the container.
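
As a rough mental model (a simplification, not Qt’s actual macro definition), a Q_FOREACH loop behaves approximately like this:

// Simplified illustration only; the real Q_FOREACH macro is more involved.
// Q_FOREACH(const QString &s, container) doSomethingWith(s);
// behaves roughly like:
{
    const QStringList copyOfContainer = container;   // the copy happens here
    for (QStringList::const_iterator it = copyOfContainer.constBegin();
         it != copyOfContainer.constEnd(); ++it) {
        const QString &s = *it;
        doSomethingWith(s);
    }
}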

This copying has two effects: First, since the copy taken is essentially const, no detaching happens when iterating, unlike if you use the C++98 or C++11 alternatives:

for (QStringList::const_iterator it = container.begin(), end = container.end(); it != end; ++it)
   doSomethingWith(*it);
for (const auto &s : container)
   doSomethingWith(s);

In both cases the (explicit or implicit) calls to begin() and end() cause a non-const container to detach from shared data, i.e. to perform a deep copy to gain a unique copy of the data.

This problem is well-known and there are tools to detect this situation (e.g. Clazy), so I won’t spend more time discussing it. Suffice to say that Q_FOREACH never causes detaches.

Except when it does.

Q_FOREACH is Convenient^WEvil

The second effect of Q_FOREACH taking a copy of the container is that the loop body can freely modify the original container. Here’s a very, very poor implementation that takes advantage of this:

Q_FOREACH(const QString &lang, languages)
    languages += getSynonymsFor(lang);

Of course, since Q_FOREACH took a copy, once you perform the first loop iteration, languages will detach from that copy in Q_FOREACH, but this kind of code is safe when using Q_FOREACH, unlike when you use C++11 ranged for-loops:

for (const auto &lang : languages)
    languages += getSynonymsFor(lang); // undefined behaviour if
                                       // languages.size() + getSynonymsFor(lang).size() > languages.capacity()

So, as we saw, Q_FOREACH is convenient—if you write code.

Things look a bit different if you try to understand code that uses Q_FOREACH, because you often can’t tell whether the copy that Q_FOREACH unconditionally takes is actually needed in any particular case, or not. A loop that plain falls apart if the container is modified while iterating is much easier to reason about than a Q_FOREACH loop.

And this brings us to porting away from Q_FOREACH.

Towards a Q_FOREACH-Free World

Things would be pretty simple if you could just globally search and replace Q_FOREACH (a, b) with for (a : b) and be done with it. But alas, it ain’t so easy…

We now know that the body of a Q_FOREACH loop is free to modify the container it’s iterating over, and don’t even for a minute think that all cases are so easy to recognize as the example with the languages above. The modification of the container may be several functions deep in the call stack originating from the loop body.

So, the first question you need to ask yourself when porting a Q_FOREACH loop is:

Does the loop body (directly or indirectly) modify the container iterated over?

If the answer is yes, you also need to take a copy and iterate over the copy, but as the nice guy that you are, you will leave a comment telling the future You why that copy is necessary:

const auto containerCopy = container; // doSomethingWith() may modify 'container' if ....
for (const auto &e : containerCopy)
    doSomethingWith(e);

I should note that in cases where the container modification is restricted to appends, you can avoid the copy (and the detach caused by it) by using an indexed loop:

for (auto end = languages.size(), i = 0; i != end; ++i) // important: cache 'languages.size()'
    languages += getSynonymsFor(languages[i]);

Avoiding Detaching

If your container is a std:: container or QVarLengthArray, you are done. Arguably, Q_FOREACH should never, ever have been used on such a container, since copying those always copies all elements (deep copy).

If your container is a const lvalue or a const rvalue, you are done, too. Const objects don’t detach, not even the Qt containers.

If your container is a non-const rvalue, simply store it in an automatic const variable, and iterate over that:

const auto strings = functionReturningQStringList();
for (const QString &s : strings)
    doSomethingWith(s);

Last, not least, if your container is a non-const lvalue, you have two choices: Mark the variable const, or, if that doesn’t work, use std::as_const() or qAsConst() (new in Qt 5.7, but easily implemented yourself, if required) to cast to const:

for (const QString &s : qAsConst(container))
    doSomethingWith(s);

There, no detaches, no unnecessary copies. Maximum efficiency and maximum readability.
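
As an aside, qAsConst() really is easy to provide yourself on Qt versions older than 5.7. A minimal stand-in, modelled on std::as_const(), could look like the sketch below; the name asConst is made up here to avoid suggesting this is Qt’s exact definition.

#include <type_traits>

// Returns a const reference to its argument so that range-for uses the
// const begin()/end() overloads and implicitly shared containers don't detach.
template <typename T>
constexpr typename std::add_const<T>::type &asConst(T &t) noexcept { return t; }

// Deleted for rvalues: binding a const reference to a temporary here would dangle.
template <typename T>
void asConst(const T &&) = delete;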

Conclusion

Here’s why you’ll want to port away from Q_FOREACH, ideally to C++11 ranged for-loops:

  • Q_FOREACH is going to be deprecated soon.
  • It only works efficiently on (some) Qt containers; it is prohibitively expensive on all std containers and QVarLengthArray, and doesn’t work at all for C arrays.
  • Even where it works as advertised, it typically costs ~100 bytes of text size more per loop than the C++11 ranged for-loop.
  • Its unconditionally taking a copy of the container makes it hard to reason about the loop.

Happy porting!

The post Goodbye, Q_FOREACH appeared first on KDAB.

Now that Nextcloud 9 is out, many users are already interested in migration so I'd like to address the why and how in this blog post.

Edit: Nextcloud 10 is out with loads of unique features. We now also have a client! You can find out about client account migration here.

Why migrate

Let's start with the why. First, you don't have to migrate yet. This release, as well as at least the upcoming releases of ownCloud and Nextcloud, will be compatible, so you'll be able to migrate between them in the future. We don't want to break compatibility if we can avoid it!

Of course, right now Nextcloud 9 has some extra features and fixes and future releases will introduce other capabilities. With regards to security, we have Lukas Reschke working for us. However, we promise that for the foreseeable future we will continue to report all security issues we find to upstream in advance of any release we do. That means well ahead of our usual public disclosure policy, so security doesn't have to be a reason for people to move.

Edit: Nextcloud 10 comes with far more features on top of this. For Nextcloud 11 we already have an ambitious road map, but we'll still enable migration from ownCloud 9.1 to Nextcloud 11 so you can migrate at your leisure!

Migration overview

If you've decided to migrate there are a number of steps to go through:
  • Make sure you have everything set up properly and do a backup
  • Move the old ownCloud install, preserving data and config
  • Extract Nextcloud, correct permissions and put back data and config
  • Switch data and config
  • Trigger the update via command line or the web UI
Note that we don't offer packages. This has been just too problematic in the past and while we might offer some for enterprise distributions, we hope to work together with distributions to create packages for Nextcloud 9 and newer releases. Once that is done we will of course link to those on our installation page.

There are other great resources besides this blog, especially this awesome post on our forums which gives a great and even more detailed overview of a migration with an Ubuntu/NGINX/PHP7/MariaDB setup.

Edit: With regard to packages, there are now packages for CentOS and Fedora and other distributions will likely follow soon. See our packages repository if you want to help!

Preparation

First, let's check if you're set up properly. Make sure:
  • You are on ownCloud 8.2.3 or later
  • Make sure you have all dependencies
  • Your favorite apps are compatible (with ownCloud 9), you can check this by visiting the app store at apps.owncloud.com
  • You made a backup
Once that's all done, time to move to the next step: cleaning out the old files.

Removing old files

In this step, we'll move the existing installation preserving the data and configuration.
  • Put your server in maintenance mode. Go to the folder ownCloud is installed in and execute sudo -u www-data php occ maintenance:mode --on (www-data has to be your HTTP user). You can also edit your config.php file and change 'maintenance' => false, to 'maintenance' => true,.
  • Now move the data and config folders out of the way. Best to go to your webserver folder (something like /var/www/htdocs/) and do a mv owncloud owncloud-backup

Deploying Nextcloud

Now, we will put Nextcloud in place.
  • Grab Nextcloud from our download page or use wget: wget https://download.nextcloud.com/server/releases/nextcloud-9.0.50.zip
    • Optional: you can verify that the download completed correctly using our MD5 checksum, see this page. Run md5sum nextcloud-9.0.50.zip. The output has to match this value: 5ae47c800d1f9889bd5f0075b6dbb3ba
  • Now extract Nextcloud: unzip nextcloud-9.0.50.zip or tar -xvf nextcloud-9.0.50.tar.bz2
  • Put the config.php file in the right spot: cp owncloud-backup/config/config.php nextcloud/config/config.php
  • Now change the ownership of the files to that of your webserver, for example chown wwwrun:www * -R or chown www-data *
  • If you keep your data/ directory in your owncloud/ directory, copy it to your new nextcloud/ [*]. If you keep it outside of owncloud/ then you don't need to do anything as its location is in config.php.

* Note that if you have been upgrading your server from before ownCloud 6.0, there is a risk that moving the data directory causes issues. It is best to keep the folder containing Nextcloud named 'owncloud'. This also avoids having to change all kinds of settings on the server, so it might be a wise choice in any case: rename the nextcloud folder to owncloud.

Now upgrade!

Next up is restarting the webserver and upgrading.
  • Restart your webserver. How depends on your distribution; for example, rcapache2 restart on openSUSE or service apache2 restart on Ubuntu.
  • You can now trigger the update either via occ or via the web. The command line is the most reliable solution. Run it as sudo -u apache php occ upgrade from the nextcloud folder. This has to run as the user of your webserver, which can also be www-data or www, for example.
  • Then, finally, turn off maintenance mode: sudo -u www-data php occ maintenance:mode --off

That's it!

At this point, you'll see the fresh blue of a Nextcloud server! If you encounter any issues with upgrading, discuss them on our forums.

This year I'm giving a somewhat technical talk called "Bring NetworkManager support to your Qt applications", where I would like to share with you the possibilities of the NetworkManagerQt framework. I will also host a BoF on Monday about Flatpak, where I plan to discuss mostly KDE Flatpak portals, so anyone interested in this topic is welcome. Aaaand to be honest, I wrote this blog post just to be able to use the lovely "I'm going to Akademy" banner. See you in Berlin!!

omiya_3

Could you tell us something about yourself?

I’m a self-taught Sunday digital painter from Tokyo, Japan. I publish my artworks under my alias Omiya Tou. Sometimes I’m also a FLOSS tester or translator and lately I’ve committed translations for G’MIC.

Do you paint professionally, as a hobby artist, or both?

Currently I paint completely as a hobby for fun and stress-relief.

What genre(s) do you work in?

I usually do 2D drawings/paintings of manga-styled portraits.

Whose work inspires you most — who are your role models as an artist?

When I create my artwork I always try to convey the tranquility and warmth of Nihon-ga works such as those by Kaii Higashiyama. Also I have been very much impressed by Yoko Tanji’s versatility in styles.

How and when did you get to try digital painting for the first time?

In 2002, when I was 14, I did what can be called digital painting for the first time. I think it was a Final Fantasy fanart. For the first few years I created my artworks by digitizing pencil line art with a scanner shared at my school, taking the data back home on floppy discs and coloring it with a mouse and a freeware painting app named Pixia on a Windows XP laptop.

What makes you choose digital over traditional painting?

It would be that digital media are less time- and space-consuming, easier to revise and enable me to focus on the pure joy of painting, leaving every boring task to the computer.

omiya_2

How did you find out about Krita?

I did when I was trying out painting apps available in Ubuntu to which I migrated from Windows to run GIMP at its full speed.

What was your first impression?

When I tried Krita for the first time it was still a sort of tech demo and unsuitable for daily use, but I felt it was worth keeping track of — and actually I’ve been excited seeing how much it has improved since then.

What do you love about Krita?

There are quite a lot of things, but I especially love the quick pop-up palette, numerous drawing assistants, well-tuned preset brushes, and the hyper-energetic dev team. 🙂

What do you think needs improvement in Krita? Is there anything that really annoys you?

For me Krita is just awesome lately, but if I must say something, I think it would be nice if the color curves dialog window were resizable and the measure tool remained visible when another tool is active.

What sets Krita apart from the other tools that you use?

Canvas tilting/flipping, GPU acceleration and CMYK capability.

If you had to pick one favourite of all your work done in Krita so far, what would it be, and why?

I’d pick this because I like the sense of depth and atmosphere:

omiya_1

What techniques and brushes did you use in it?

I colored this by overlaying a solid blue-gray layer which is set to color dodge mode onto a grayscale image. A lot of this is painted using a brush named Bristles_hairy, definitely one of my favorites.

Where can people see more of your work?

I cross-post my works onto deviantART, Tumblr and Flickr.

Anything else you’d like to share?

On my Flickr page I post my artworks under a CC-BY license in their full resolution, so please feel free to use them. Thank you!

One of the most interesting killer projects of the KDE Community keeps moving forward: KDE Connect 1.0 has been released, the "must have" application for both your KDE desktop and your smartphone if you want to simplify and optimize the interaction between them. It's time to update and see what's new.

KDE Connect 1.0 released

One of the applications you should have on your phone and on your computer if you want to use them efficiently has just published its first definitive version, 1.0 (although, as we know, that doesn't mean it wasn't usable in its earlier versions).

On August 26, Albert Vaca, creator of KDE Connect, published a post on his blog announcing the release of the new version of his wonderful software: KDE Connect. This new version comes with many important new features, which the magnificent Víctorhck has translated from the official site on his blog and which I simply copy and paste here, so as not to duplicate work and to share knowledge:

  • Pre-configured commands, in the style of "launchers" or shortcuts, which make you more productive and faster at the most common tasks.
  • You can reply to SMS messages directly from your desktop. One of the most awaited features: when you receive a text message notification on your desktop, a "Reply" button lets you answer the message without having to use your phone. Magic? No, just free software! By the way, to enjoy this you will need version 1.4 of the Android app, already available.
  • Receive your desktop notifications on your phone. This feature is disabled by default, as it can get a bit annoying. You will have to enable it both in the Android app and in System Settings, and you can choose which notifications are forwarded to the phone and which are not.
  • KDE Connect gains more robust encryption, switching from RSA to TLS. Besides protecting the communication between both devices, this also improves speed and reduces battery consumption.

Remember that to use this application you must have it installed both on your computer (I already explained how to do it for openSUSE Leap 42.1; other distributions ship it out of the box) and on your smartphone, downloading the app from F-Droid or the Play Store.

As we can see, plenty of news in one of the most renowned projects of the KDE Community, partly the fruit of the work done at the Randa Sprints.

More information: Albert Vaca's blog | Víctorhck in the Free World | La mirada del replicante

August 28, 2016

If you want to know what we did in KDE PIM in the last year and what we are planning to do and achieve in the next one come to my KDE PIM Status Report talk on Sunday at 1 PM. If you want to get into more technical details and discussions about KDE PIM there will also be a KDE PIM BoF session on Monday afternoon.

I'm going to Akademy 2016! See you in Berlin!

 

The development of KDE's Plasma 5 desktop keeps advancing, both in its functionality and in its design. In the next big update we will get another little visual treat: multicolor folders for the upcoming Plasma 5.8 that adapt to the color scheme we have selected. There is no stopping this.

Multicolor folders for the upcoming Plasma 5.8

As we can read on KDE and Linux, the design team is working on a new visual feature that will delight lovers of customization: the official Breeze icon theme will have multicolor folders in the upcoming Plasma 5.8.

This means that the improved folder design will look like the image below:

Carpetas multicolor para el próximo Plasma_01

Don't see anything new? Of course, that's because it's a screenshot with the default color scheme. But if we play with that color scheme, the folders change like this:

Carpetas multicolor para el próximo Plasma

 

Amazing, isn't it? This feature is simply an extension of one we have had since Plasma 5.6, but which unfortunately I never covered on the blog: Breeze's monochrome icons change according to the color scheme.

Carpetas multicolor para el próximo Plasma_04

In other words, if we choose a color scheme where the text is blue, the monochrome Breeze icons will take on that color when displayed on screen.

This change came about from the need to improve how these icons are displayed. Many Breeze icons are now monochrome, so a lot of contrast is needed for optimal visibility. This led to adding a new property to the icons, which in turn brought several improvements to the Breeze visual theme:

  • Reducing the code by two thirds
  • Improving icon rendering speed
  • Reducing the size of the icon pack from 28 MB to 9.4 MB

As we can see, the visual appearance keeps being polished while respecting Plasma 5's customization options.

More information: KDE and Linux – Rainbow Folders | KDE and Linux – Performance update for breeze icons

Weee akademy

QtCon starts on September 1st and I will be traveling to Berlin this Tuesday. This is the second Akademy I am attending, after Akademy 2014 in the Czech Republic. During QtCon I am giving a presentation titled Plasma Mobile: what we achieved in a year, in room A04 on September 2nd. I am also going to moderate the Student Presentations, where students taking part in various mentoring programs in KDE, like GSoC, GCI, SoK, or OPW, present their work.

I am also taking part in various BoFs starting from September 5th, like the Plasma BoF, KDE SoC, and the KDE Neon BoF.

Looking forward to a productive QtCon! See you in Berlin! :-)

August 27, 2016

I’ve given my website a facelift. The new version is more mobile-friendly, modern-looking and quite a departure visually from its original look. I’ve chosen a newspaper-like, typography-based responsive layout. My site finally also supports SSL, thanks to Let’s Encrypt.

Going to Akademy
Next week, I’ll be going to Akademy, which is co-hosted with QtCon. As usual, my focus will be around Plasma-related topics. I’ll also hold a presentation about a KDE software store.

In less than a week QtCon will start and Randa Meetings will be part of it. On Saturday at 16:30 you’ll get an interesting story about 7 years of Randa Meetings. Mario Fux will talk about his experience organizing the KDE Tech Summit in the middle of nowhere in the Swiss Alps.

Besides Mario you will have the opportunity to meet and talk with the president of the Randa Meetings association: Simon Wächter. Don’t miss this chance and get a second opinion and maybe some insights.

And now on a different subject, which is something we might have as one of our main topics next year: Accessibility and Personal Information Management (PIM). Accessibility doesn’t just mean making our software accessible for people with various disabilities, but also making it accessible on different operating systems, on different devices and with different user interfaces (graphical with keyboard and mouse, touch, or speech and other senses). Regarding PIM, I don’t think you need more information about what this is: e-mail, contacts, events, all of it synchronized and stored in a safe way.

In case you think it’d make sense to bring another group or topic to Randa as well, don’t hesitate to contact us: #randa on IRC, mail, QtCon or via snail mail (see randa-meetings.ch for details).

So if you can contribute something to these topics, work on Free Software in these areas and want to come to Randa next year: please join the date selection Doodle for 2017.

I'm going to Akademy

Flattr this!

Hey,

I’ll be at QtCon and Akademy in Berlin, mostly as observer, I guess 🙂

I’ll also hold two workshops/discussions on Tuesday the 6th (see [1]).

Music player

For some time now, there have been threads [2] about designing a new music player. The VDG people came up with a vision and some first design ideas [3], and I built a first specification page on the community wiki [4].

I’ll run a workshop which will deal with:

  • what will make this player not just another player
  • feature discussion
  • architecture design
  • use of the public libraries of plasma-mediacenter
  • …what you want to add…

The first three points will be mostly a presentation of what I want to do, with discussions about how it can be done better.

I have already written some code, but it was mostly for experimentation; the project was put on hold until this Akademy session, in order to start on the right basis.

Documentation and KApiDox

I also reserved a slot for KApiDox, the program that generates the api.kde.org website. The codebase has changed a lot lately, and more and more projects are being generated. However, it’s still not very robust to errors (the whole process would break instead of just ditching the error source) and it turned out not to cover every use case.

If you think our API documentation is important and can/should be enhanced, please join the discussion so that I can learn about your needs (as an API user or API writer) and improve the whole thing in the future.

Links

[1] https://community.kde.org/Akademy/2016/Tuesday
[2] https://forum.kde.org/viewtopic.php?f=285&t=122273
[3] https://community.kde.org/KDE_Visual_Design_Group/Music_Player
[4] https://community.kde.org/Playground/MediaPlayer



August 26, 2016

After some time of activity on KBibTeX's master branch, I finally returned to the stable branches to push forward some releases.


Colorpick is one of my little side projects. It is a tool to select colors. It comes with a screen color picker and the ability to check that two colors contrast well enough to be used as the foreground and background colors of a text.

Contrast check

Three instances of Colorpick showing how the background color can be adjusted to get readable text.

The color picker

The color picker in action. The cursor can be moved using either the mouse or the arrow keys.

I wrote this tool a few years ago, using Python 2, PyQt 4 and PyKDE 4. It was time for an update. I started by porting it to Python 3, only to find out that apparently there are no Python bindings for KDE Frameworks...

Colorpick uses a few kdelibs widgets and some color utilities. I could probably have rewritten those in PyQt 5, but I was looking for a pretext to have a C++ based side project again, so instead I rewrote it in C++, using Qt 5 and a couple of KF5 libraries. The code base is small and PyQt code is often very similar to C++ Qt code, so it only took a few 45-minute train commutes to get it ported.

If you are a Colorpick user and were sad to see it still using Qt 4, or if you are looking for a color picker, give it a try!

Time flies, truly. With the end of this month comes the end of an amazing programme: "Google Summer of Code 2016".
As planned earlier, I have successfully implemented the GSL library to construct histograms for both static and dynamic input values.

Now, a user can
  • Draw different types of histograms for a given set of values.
  • Change the method of selecting the bin value.
  • Change the background and filling of the histogram graph.
  • Alter the properties of histogram graph scaling (auto-scale, color, pattern).
  • Mark the magnitude of a bin (individual or cumulative).
  • Draw/plot more than one curve on the same worksheet to enhance analytic study and comparison.
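
For readers who don't know GSL, the sketch below shows the kind of histogram API this work builds on. It is illustrative only, not LabPlot's actual code; the bin count, range and sample values are made up.

#include <gsl/gsl_histogram.h>
#include <vector>
#include <cstdio>

int main()
{
    const std::vector<double> values = {1.2, 3.4, 2.2, 4.8, 3.3, 0.7};

    // 10 uniform bins covering the range [0, 5)
    gsl_histogram *h = gsl_histogram_alloc(10);
    gsl_histogram_set_ranges_uniform(h, 0.0, 5.0);

    for (double v : values)
        gsl_histogram_increment(h, v);   // each value is counted in its bin

    // Print the bin ranges and their counts
    for (size_t i = 0; i < gsl_histogram_bins(h); ++i) {
        double lower, upper;
        gsl_histogram_get_range(h, i, &lower, &upper);
        std::printf("[%.1f, %.1f): %g\n", lower, upper, gsl_histogram_get(h, i));
    }

    gsl_histogram_free(h);
    return 0;
}
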
Though the programme has come to an end, I will continue contributing to the KDE community and LabPlot. I would like to thank all my mentors in LabPlot.
Working demo: https://www.youtube.com/watch?v=XHjYvmInXY

Thanks for reading :)

Today we are officially publishing the first stable release of KDE Connect. Hooray! This is the most solid and feature-packed version we have ever released. It’s been in development for a year now and took a lot of hard work; we hope you like it!

New features

  • Trigger custom commands from your phone

Pre-configure commands in the KDE Connect desktop settings so you can trigger them from your phone. Use it to extend KDE Connect’s functionality to suit your needs!

Android screenshot with list of commands

  • Reply to SMS messages from your desktop

Probably the most awaited feature ever! Now when you receive a text message notification on the desktop, a ‘Reply’ button will allow you to text back without having to use your phone at all. Note that you will need version 1.4 of the Android app (already available) for this to work, as we had to ask for a new permission.

  • Receive desktop notifications on your phone

Contributed by Holger Kaelberer, this is the counterpart of the phone-to-desktop notification sync we already had. It might be a bit spammy sometimes, so we decided to ship it disabled by default. Make sure you enable it both in the Android app and the System Settings module if you are interested in this feature. From the plugin settings you can choose which notifications you want to forward to your phone and which not.

  • TLS encryption

Thanks to the Google Summer of Code project of Vineet Garg, KDE Connect now uses TLS sockets instead of RSA private-key encryption. This is not only safer against replay and man-in-the-middle attacks, but also faster and less battery-consuming to compute on your devices. Like SSH, we do trust-on-first-use (or TOFU, which sounds funnier) of the device certificate, and we have added a command line option that lets you check that the certificate fingerprints match on both ends.
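
To illustrate the trust-on-first-use idea (this is not KDE Connect’s actual implementation; the function and variable names below are invented for the example), the check can be as simple as comparing certificate fingerprints with Qt’s SSL classes:

#include <QSslCertificate>
#include <QCryptographicHash>
#include <QByteArray>

// Hypothetical sketch of TOFU: trust the first certificate we see,
// and afterwards only accept the device if its certificate is unchanged.
bool isDeviceTrusted(const QSslCertificate &presented,
                     QByteArray &storedFingerprint)   // empty on the first connection
{
    const QByteArray fingerprint = presented.digest(QCryptographicHash::Sha256).toHex();
    if (storedFingerprint.isEmpty()) {
        storedFingerprint = fingerprint;   // first use: remember and trust it
        return true;
    }
    return fingerprint == storedFingerprint;   // later: must match what we stored
}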

Android screenshot with the 'Encryption info' dialog

How to update?

If your favorite Linux distribution doesn’t release an update for KDE Connect 1.0 soon, please contact the distro packagers and let them know you want it! If you are familiar with building software from sources and can’t wait for your distro to package it, you can always build KDE Connect 1.0 from the sources available on download.kde.org.

While the Android app is backwards compatible with desktops running old versions of KDE Connect, the just-released desktop version requires version 1.0 or newer of the Android app. Since we have seen that Android updates reach end users much faster than their desktop counterparts, this shouldn’t impact your ability to use KDE Connect. Just make sure you are using a recent version from F-Droid or the Play Store.


Freelance artist Nikolai Mamashev has launched an initiative to create an animated version of the open source webcomic “Pepper & Carrot” by David Revoy.

Many Krita users are already familiar with David Revoy and his work. His comic is produced mainly in Krita and all .kra files are freely available online for reuse and to make derivative works.

Nikolai’s idea is to bring David’s comic into new media by animating one episode of “Pepper & Carrot”.

For this work Nikolai will also be using Krita to convert the static webcomic images into separate layers and to do all the additional painting (of which he expects there to be a lot).

pepper1
The animation work will be done in Blender, with the assistance of the CoaTools addon. Also, he plans to use Krita to create some frame-by-frame animation elements (lipsync, complex movements etcetera). The rendering management will be done with RenderChan.

Nikolai has published a video demonstrating the first two animated shots:

To make this project possible, Nikolai has launched a crowdfunding campaign. If this campaign is successful he will be able to create an animated version of Episode 6 of Pepper & Carrot, “The Potion Contest“!

The result is going to be published under the Creative Commons Attribution-ShareAlike license, together with all sources.

Support animated Pepper & Carrot!

David Revoy says… “Nikolai draws better than I do! Support this work!”

 

The 2016 Krita sprint has finally begun in the beautiful city of Deventer, Netherlands this weekend. Artists, developers, testers, designers, and documentation writers are gathering from around the world to learn from each other and help define the future of Krita. The last big Krita meeting was in 2014, and now we’re meeting again!

Most of the people at this year’s sprint are volunteers. Only Dmitry works on Krita full-time and Boudewijn part-time. It’s good to have a real-life meeting where we can see each other’s faces, have lively discussions, and enjoy meals together.

People started arriving in Deventer yesterday, August 25. The weather is tropical for the first time this summer. We moved into a twelfth-century cellar under Boudewijn’s house where at least it’s cool! Usually the cellar is in use as the coffee room of the Orthodox Church — but not right now. There is plenty of space, with tables, coffee cups and glasses for all of the sprint participants. Add internet and we have an instant hacking room!

DSCF5823

On the first day these were some of the topics we discussed:

  • how to manage releases
  • approaches to porting the ODG-based vector objects to SVG
  • which parts from SVG2 we’re going to need — text is the most important thing
  • animation workflow improvements
  • final evaluations for our Summer of Code students

Just imagine this: Krita 3.0.1 will already have the first results of the Summer of Code work done by Wolthera — soft proofing! And of course, there was dinner, and then more hacking!

DSCF5826

We will be having a couple more people show up tomorrow. Some of the conversations won’t begin until then, and that’s when things will really pick up speed.


Older blog entries


Planet KDE is made from the blogs of KDE's contributors. The opinions it contains are those of the contributor. This site is powered by Rawdog and Rawdog RSS. Feed readers can read Planet KDE with RSS, FOAF or OPML.