Things are moving in the publishing landscape! Most recently with works from the academic world! What has long been considered too specialised and inaccessible is now being opened up in new ways! Rounded Globe is a new, curating publisher for academic publications. There is already a bunch of interesting publications to download, among them J.R.R. Tolkien’s Lost English Mythology by Simon J. Cook, which explains Tolkien’s lifelong project of reconstructing ancient Nordic traditions.
The Rounded Globe initiative challenges conventional methods, business models and norms with openness as its tool, and the result is mightily exciting. I really look forward to following the development of this service and all the support it brings for writing academics, making their work more accessible and extending its reach. They do this through copy-editing, programming and cover design. The publisher itself is not part of the #OpenAccess movement but focuses on curating the best texts from academia.
As a publisher, Rounded Globe uses the Patreon platform for donations. Patreon is yet another novel approach to supporting artists, organisations and projects. If you find Rounded Globe good and interesting, you can support them via Patreon at this link!
What will transform the publishing landscape of curated text next? I look forward to following the developments and the currents!
One of the most sought-after things when browsing the Internet is that a website loads quickly on the first visit. The choice of web browser is often considered the culprit when things are slow. The browser does matter (I recommend Mozilla Firefox), but I discovered that much else plays a role as well. One answer I found is Varnish, an HTTP cache that is Free and Open Source Software, which I set up on my blog via a plugin.
As part of this project of exploring open solutions, I realised that I need to live as I learn, in the dual sense that I also want to learn as I try out this way of living. Therefore I strive to implement the features and benefits that come with Free and Open Source Software. Varnish and its plugin are one such step.
I installed the Varnish plugin very easily from the WordPress repository. I have not yet bothered to look deeper into any Varnish configuration, even though that would be worthwhile.
What Varnish does is store a kind of snapshot of the page your web server delivers to a visitor. It then reuses this snapshot for subsequent incoming visitors instead of generating the whole page from scratch every time.
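The principle can be sketched in a few lines of Python. This is a conceptual illustration only; Varnish itself is written in C and caches full HTTP responses with far more sophisticated rules, and the TTL value and function names here are my own assumptions.

```python
# Conceptual sketch of what an HTTP cache like Varnish does:
# generate a page once, then serve the stored copy to later visitors.
# Not Varnish's actual implementation; names and TTL are assumptions.

import time

cache = {}   # url -> (body, stored_at)
TTL = 120    # seconds a cached page stays fresh (assumed value)

def generate_page(url):
    """Stand-in for the slow work the web server normally does."""
    return f"<html>content for {url}</html>"

def fetch(url):
    entry = cache.get(url)
    if entry is not None:
        body, stored_at = entry
        if time.time() - stored_at < TTL:
            return body, "HIT"      # served from cache, backend untouched
    body = generate_page(url)       # cache miss: do the expensive work once
    cache[url] = (body, time.time())
    return body, "MISS"
```

The first request for a page is a miss and does the expensive work; every request within the TTL after that is served straight from memory, which is what makes the server so much more efficient.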
This makes servers more efficient at serving pages, for three equally important reasons.
As the web has become more dynamic, keeping it and its websites efficient is all the more important. That makes Varnish’s background and history interesting from the perspective of frequent content creation. Wikipedia has a good article describing the program in detail.
Today’s dynamic websites make this tool extra interesting; it actually originates from the tabloid newspaper Verdens Gang in Norway. It was initially developed by a FreeBSD developer from Denmark named Poul-Henning Kamp. As the project grew it was supported by Linpro (now Redpill-Linpro), a Linux consultancy in Norway that also operates in Sweden, where they gave a lecture on the software. Varnish has also received awards for its much-needed functionality.
Varnish is Free and Open Source Software, licensed under the FreeBSD (two-clause BSD) license, and development began in 2005. The logotype is covered by its registered trademark, which is why I am not publishing it here.
Once I have found a free guide and configured my Varnish installation, I will write more!
Some people need a different operating system, and I know this because I have changed the lives of a few of them!
During the last couple of months I have changed people’s lives by spreading knowledge about GNU/Linux and installing it on computers for people I know. I have helped them migrate from obsolete systems that did not suit their needs to the Ubuntu distribution, a derivative of the old and well-developed operating system Debian; both had new releases recently and both are built around the Linux kernel. What were the reasons, and why did they want it? In these cases it all comes down to usability: the systems were not adapted to these people in various ways, and I believe this is a universal issue.
Before I set out travelling I helped my grandfather switch to Ubuntu. He worked with computer systems back when they were machines for businesses only and were physically “large as a classroom in school”. Now he is revamping his experience with an entirely new chapter. The installation transformed his slow computer and the confusing Windows Vista system into something that inspired him and got him interested in computers once again, while also reminding him of some downsides from his early days in the business. I have written about my first experiences of migrating from Windows XP, when I did myself the big favour of refurbishing an unusable computer into a perfectly fine workstation, cloud storage and server.
Everything seemed to work fine while I was out travelling, until the setup with last year’s version 12.10 of Ubuntu, Quantal Quetzal, crashed. I realised that a lot of responsibility comes with being the sole tech support during and after a migration. There were many new things he had to get used to, which we had considered. But before I left for my journey I did not manage to find time to look at an error that popped up after his computer booted; I thought it would cause no real damage, since we had encountered a couple of those during our first sessions when I taught him the basics of the system. That was exactly the error that did it. About two weeks into my journey the computer crashed and he was left without any support to get going again, which was not what I intended and something I learnt I should have expected.
The second time it worked out really well. I came to the conclusion that the errors that made the system crash stemmed from the way I must have installed the printer drivers, with a mistake during the process. When we tried again a couple of weeks ago, after the new version 13.04, known as Raring Ringtail, had been released, it worked out fine. The printer was installed again and now works perfectly, without any errors being reported at all. He now enjoys his sessions doing the basic things, with a beautiful desktop environment left to learn and discover more about.
The second time I was asked to do an installation was when I helped the mother of a friend, whose experience of and associations with Linux were more positive than with Windows. She preferred to have the system in Spanish, but as Windows limits a license to one language only, she had to put up with an English interface even though she is interested in learning the language. When we installed Ubuntu Raring Ringtail here too, it was convenient for her: less fuss and simple to learn. The system language could be changed in an instant whenever needed, along with the keyboard layout, and there was room for her to learn a cleaner, simpler system that was not as confusing as the Windows 7 that had worked well enough before. It was a matter of adaptability, language and simplicity, which the Microsoft Windows experience could not offer a paying customer but which Ubuntu could offer for free.
A challenge in turning to a Linux-based operating system such as Ubuntu, for an average user, is that there can be fundamental visual differences from Windows, such as Ubuntu’s Unity interface, which in turn feels very similar to OS X and its Dock. As a frequent user of both OS X and Windows I felt it was a minor step, but then again I am very interested in discovering and learning about this, which might not be as simple for everyone. I actually found many things much simpler in the Debian-based Ubuntu. There are still flaws: one thing I noticed with both setups after moving to the latest version is that the user interface still feels quite slow, although these computers had rather low-performing graphics units, which may well be the deciding factor. There is also the problem of giving technical support while physically disconnected, which requires the user to have a backup support solution. I have a couple of ideas on this which I aim to develop.
Yesterday I was offered the chance to play a bit with an HTPC media computer, and more specifically to install any suitable distribution of GNU/Linux. This friend of mine has barely used the hardware, although it looks like perfectly fine goods to me; this time it was the typical Windows user experience that hurt his satisfaction: all the “bloatware” with its annoying popup messages, along with the built-in Windows pop-ups, made the computer slower than necessary and sapped his will to use it at all. Feel free to comment below if there is a specific media-centre setup that is easy to set up and manage for a beginner, as I aim to offer it to someone relatively new to the idea of GNU/Linux.
I find a lot of joy in helping others with these issues, as I am gaining a lot of knowledge while spreading these systems and the philosophies and mindsets that come with them. I call it The Windows Exodus, because that is what it has been so far: a migration away from non-adaptive and non-inspiring versions of the old NT kernel, which is the foundation of what we perceive as the PC today and is still present in the latest Windows 8. The Free Software Foundation has a rather interesting infographic about this, which has been circulating on the social media I follow.
I aim to change this recent development by starting something bigger than my own capabilities. I have come to the conclusion that the PC experience is lagging behind mostly because of the rather bad connotations people have with its systems today. This is negative for a whole industry and needs to change, and I believe GNU/Linux systems can help do that.
Completely decentralized sharing of information with P2P? No need to go to sites such as LegitTorrents?
Yes, one answer for the future can be spelled Tribler. This P2P (peer-to-peer) BitTorrent software is based on the idea that connecting to a central node in a network for sharing information can be risky and disadvantageous if something were to happen to it. Thus Delft University of Technology and Vrije Universiteit Amsterdam started it as a research project in 2008, with funds from the European Union, to develop social information sharing without any central key nodes in the network.
The interesting thing about this software has also been its threshold in terms of first impressions and usability. In use, the program has been rather slow and unresponsive, so I have not wanted to recommend the client to family, friends and fools; but from an idealistic perspective I see it as a very good source of information and material as the user base grows and the quality improves, which it steadily does. One thing that has been troublesome when using Tribler is the program’s collection of torrent files, which is what makes searching for content possible. This is described in the official forum as a function that uses a cache folder of torrent files on the user’s computer, which other users can access to start downloading as a regular BitTorrent download.
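As a rough illustration of that idea, a keyword search over such a locally collected set of torrent metadata might look like the Python sketch below. The records and the `search` helper are my own assumptions for illustration, not Tribler’s actual data model or API.

```python
# Hedged sketch: Tribler collects torrent metadata locally so that
# searches need no central index. The records and the `search` helper
# are illustrative assumptions, not Tribler's real code.

local_torrents = [
    {"infohash": "aa" * 20, "name": "Free Culture Documentary"},
    {"infohash": "bb" * 20, "name": "Open Access Lecture Series"},
]

def search(keyword):
    """Match a keyword against the locally cached torrent names."""
    kw = keyword.lower()
    return [t for t in local_torrents if kw in t["name"].lower()]

# A hit can then be fetched as a regular BitTorrent download,
# identified by the matching torrent's infohash.
```

The point of the design is that every peer carries a slice of the searchable index, so no central site has to exist for content to remain findable.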
I have tested different versions of the software since 6.0.0 on the Windows and OS X platforms. On Windows, startup has been extremely slow, and you can see Tribler begin indexing files. On OS X there have been bugs during start-up as well, but more so when shutting the program down. One personal observation was that the settings could not be changed on OS X; once this was reported on the forums it was solved shortly after, in the next version. The upcoming releases will focus on solving issues for some people, as it is a very power-consuming process for the computer. However, things have changed a lot (currently 6.1.0): the software now feels quick and has been cleaned up a bit graphically, and the speed of development over just a couple of months has been impressive. The downsides at the moment can be noticed in the Family Filter, which does not seem to work, and in how trackpad scrolling does not function in Tribler as it universally should in Mac software.
Tribler is licensed under the LGPL, which opens the project up for anyone to help out; but the big advantage I see for people who want to learn about networks, and P2P networks in general, is that they can scrutinize the code and see how it functions. In other words, the universities may not only be conducting advanced and important research, but also educating external people involved in the project, while at the same time providing a piece of software that can be very useful in future versions.
The project is interesting in so many ways because it uses almost all of the advantages of open source software. It looks into different scenarios of peer-to-peer and collaborative production and consumption, realized through P2P networking. With funding from different institutions, and by being part of QLectives (Quality Collectives), it opens up a lot of research on building and deploying the next generation of intelligent information systems. Development involves various community contributors, but the core team behind the project consists mainly of researchers from the universities behind it, such as assistant professor Johan Pouwelse and Henk Sips at Delft University of Technology in the Netherlands, and judging by the forum the community seems to be growing, as this is a rather unique solution.
Tribler has a very simple user interface, but one thing I found unnecessary is the media player function, which at the moment does not add any extra value for the user. The idea is that the software should be able to stream content from other users directly in the application; I tested this on both platforms, in different versions, but never with success. The function nevertheless addresses a huge issue the Internet faces: content services filling the bandwidth of the web as more and more people turn to the market of commercial streaming services. A larger share of cultural consumption goes through digital means every day, and this is how Tribler plays a part in the P2P-Next project, by experimenting and working on solutions for better decentralized content streaming through P2P. The project has been funded not only by the EU QLectives but also by the I-Share project, STW and PetaMedia. A communicative improvement would be for the project team to update their website more often and increase transparency about the details of funding and where the millions in research money go in development. The project is very interesting as a basis for commercial businesses, and I am curious to see how it develops in the future.
I will definitely use this software once I get going with a distribution of GNU/Linux. It is my ideal choice for a fuller client in the future, as I can cheaply and quickly become an agile distributor of software and culture. I have tried the personal channel function for distribution with this project, to spread reviewed music and films, but it will be interesting with content for other agendas as well. Most interesting of all right now is the aim to introduce full encryption in the project and work on anonymous sharing, which matters both for the integrity of the user and for research purposes.
With the Tribler project, distribution of information has never been more decentralized and you can contribute by downloading it.
Two weeks ago I fetched my old desktop computer, which had been collecting dust in a neighbouring house’s basement. I was convinced it was still usable and alive enough to run newer operating systems than the Windows XP it had been running for the last ten years. So I brought it home and started configuring the operating system by heading to Ubuntu’s website. Ubuntu is a distribution of Linux developed by the company Canonical Ltd. From what I have heard it is one of the most user-friendly Linux distributions. It is not as commercially independent as it could be compared to other distributions, because of its maintenance ties to Canonical, but logically more independent than the closed-source operating systems developed by Apple and Microsoft. It will do for now, because what I am experiencing is just fine for my understanding of how Linux functions in comparison to OS X and Windows. I also learned during this period that OS X and Linux are siblings, both shaped by the same system design, Unix, written almost 45 years ago, a sign of how old the pillars of modern operating systems are.
After installing Ubuntu through the WUBI (Windows) installer, I started up the computer and realised there were some major issues with this configuration. On boot I had to manually select which operating system (OS) to run for that session, in my case Windows XP Home Edition or Ubuntu 12.10, a so-called dual boot. In the long run I would not stand having it like that, but further issues developed, demanding even more of my sparse free time to solve. When I finally was done with the installation I ran into heavy graphical issues: the screen suddenly turned black after I had logged in. I did a lot of research and found that the graphics card could be incompatible, so I followed a forum thread, bashed in some commands and hoped it would solve the problem. It did not, and I felt rather despondent. Then I thought there was only one thing to do: look for the possibilities.
After some more research I understood that Ubuntu was far too demanding for the old machine, so I moved on to look for alternative distributions, which I did via Distrowatch. The information about system requirements could have been much more accessible, and this problem was about to haunt me again. Free licensing has allowed and encouraged modifications of releases by the user community, which has for example resulted in Ubuntu being developed from Debian, another Linux-based operating system and ‘the rock upon which Ubuntu is built‘. This means there are, interestingly enough for me, low-requirement distributions such as the one I found most suitable as a beginner: Lubuntu, a derivative of Ubuntu that uses another graphical interface. When I started the install process everything seemed to go very well, but at the point of actually erasing and installing the system it appeared to stop. I let the process roll on while I slept, seven hours straight without interruption, but realised fairly soon that it had frozen. After a lot of research and asking questions on forums and chats, I finally found the problem: my computer could not handle the system requirements, this time of the graphical version of the Lubuntu installer, which annoyed me a bit. I went on to download the text-based installer, and it was a success. It could all have been so much easier with better access to such information, which would have saved me a lot of time and energy in this very intense period of work.
However, I am very glad I took this step; it is almost as if I have opened Pandora’s box. I want to dive deep into all this knowledge.