Making a modern-looking PDF paper

The default output from the LyX word processor relies on the pdflatex program, which is outdated today: it is very slow, does not support Unicode natively and has problems with modern fonts. The better alternative is:

1. Export the document from LyX as XeLaTeX source
2. Edit the LaTeX file and insert a font setup command (a sketch is shown after the compile steps)
3. Compile:
xelatex paper32.tex
bibtex paper32
xelatex paper32.tex
xelatex paper32.tex
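
A minimal sketch of the font setup from step 2, placed in the preamble of paper32.tex; the concrete font is an assumption, FreeSans is chosen here only because it is the font discussed below:

\usepackage{fontspec}   % fontspec lets XeLaTeX use installed OpenType/TrueType fonts
\setmainfont{FreeSans}  % assumed choice: GNU FreeSans as the main text font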

After compiling the file with xelatex, modern OpenType fonts are used. In the PDF document they may be listed as TrueType fonts, because FreeSans is an OpenType font with TrueType outlines and is embedded in that form. The strategy explained above replaces former LaTeX best practices such as the Metafont program (used in the early days) and Type 1 fonts (not Unicode-ready). Using XeLaTeX together with OpenType is not a reinvention of LaTeX but an evolutionary development.

The bad news is that the FreeSans font does not cover Chinese characters. But that is usually no problem, because scientific documents are mostly written in English.


Windows is like the Commodore 64


There was a time in the 1980s when the Commodore 64 was a great piece of hardware. Its main advantage was its price. For the first time, a normal consumer had the opportunity to buy a computer. He did not have to be part of the university system or an electronics hobbyist; the Commodore 64 was sold in the same stores as washing machines and TV sets.

Today it is still possible to write programs for the same machine. The best way of doing so is the CC65 cross-compiler: the programmer writes a program in C, which is compiled to assembly, and it is even possible to write programs for the GEOS operating system. But does this make any sense? No. From a technical point of view the C64 is no longer interesting, because it is no longer sold, its RAM is too small, and there is no community interested in newly created software. The main lesson of the Commodore 64 is that every product has a lifetime; from around the 1990s on, the Commodore 64 was only computer history, not a working system.
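
For illustration, that cross-compilation workflow looks roughly as follows, as a sketch assuming the cc65 toolchain is installed and a small hello.c exists in the current directory:

# compile, assemble and link a C source file for the C64 target in one step
cl65 -t c64 -o hello.prg hello.c
# load the resulting .prg file in an emulator such as VICE
x64 hello.prg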

Let us take a look at the MS-Windows operating system. Like in the Commodore 64 era, there was also a time in which Windows was great. And yes, Windows can be called the legitimate successor of the C64. The PC revolution of the 1990s resulted in cheap hardware, like the C64 but with more capacity. The 1990s were the first time a normal consumer was able to buy hard drives with more than 100 MB and high-resolution graphics cards. In a direct comparison, the IBM PC and its clones had a better price/performance ratio than, for example, the CMD hard drive for the C64. So the PC revolution was also a poor man's computer revolution.

The cheapest operating system at that time was MS-Windows. But like the Commodore 64, the product had only a limited lifespan. It was replaced by the Linux operating system, and as in the lesson before, this had to do with a better price/performance ratio. Running a web server with Windows was possible but very expensive; doing the same with Linux was no big deal. After the C64, the Windows operating system was the next system that turned from an actively maintained product into computer history.

From today's perspective it is possible to give both a second life in a so-called emulator. It is possible, for example, to boot MS-Windows in KVM and write software for it. Other people go a step further and have programmed a Windows clone called ReactOS. ReactOS is similar to the FreeDOS project, which is also a look back at the good old times. The question is: does it make sense to write software for FreeDOS or ReactOS? Probably not; like the C64, they are outdated technology.

For example, we can ask the FreeDOS programmers and the ReactOS programmers what their wishes for the future are. Do they really want to program an operating system which is used on real hardware? No, they do not. FreeDOS and ReactOS are legacy by design; they are not meant to replace Linux. Instead, such a project can be seen like a C64 emulator, created with the idea of preserving the past and showing future generations what the system looked like. Comparing ReactOS with Linux makes no sense: Linux sees itself as an operating system for today's needs, while ReactOS sees itself as an emulator for outdated technology.

Until now I have ignored the number of users MS-Windows has today. According to the latest count, at least 2 billion people worldwide are part of what they call the Windows community. That is a group of amateurs who use MS-Windows as their main operating system in order to install games and even word processors on it. According to their self-image the community is healthy and stable. But in reality, MS-Windows users have not recognized that times have changed. Since the 1990s their system has been outdated, but they have not noticed it. In theory it is possible to ignore reality and use MS-Windows like a C64 forever. But everybody who is interested in learning something about computers will quickly recognize that MS-Windows has internal limits which cannot be overcome. That means there is not one particular bug in the software; the overall system is broken.

Microsoft has recently published the so-called “Windows Subsystem for Linux” (WSL). The marketing campaign was started with the aim of bringing former Linux users back to Windows. In the self-perception of the MS-Windows community, their system is the future and Linux is nearly dead. The reality is that the Windows Subsystem for Linux tries to update Windows to the next iteration, which will not work. The next step after the current WSL would be to run not only text applications but GNOME apps, and the step after that would be equivalent to publishing parts of the Windows source code under a GPL license, and so forth. WSL is not an improvement of Windows; it will be its end. MS-Windows was never created as an open-source project; the community will collapse in the attempt to imitate Linux.

What we see right now is the maximum of the MS-Windows market share. Microsoft is, like Commodore before it, at the top of the pops. They believe they are unbeatable, but in reality they have lost the war. And the so-called Windows community does not even recognize the problem. They believe that the number of newly published games for the Windows operating system is a clear indicator that the software market is doing well. But in reality, nobody wants to produce software for Windows. The programmers have other plans than cross-compiling their Linux apps for the Windows market. The motivation of a programmer is simple: he wants to learn something about technology, and he does so with a system which supports his progress. Windows supports nobody. It is not possible to learn on that platform, especially not how to program future applications.

Short history of Unix

Unix is around 50 years old. In the 1970s, Dennis Ritchie created the C programming language, and together with Ken Thompson he programmed the early Unix operating system, which fit into roughly 40 KB. C was a replacement for assembly-like programming languages and is very efficient. The 1980s were the era of commercializing Unix: the former university research project migrated into a business. It was no longer computer scientists who programmed the software but companies like Sun (Stanford University Network). The 1990s were dominated by the invention of Linux. Linux is not a technical invention, because it used the same C programming language and the same Unix specification; it is a legal idea to reduce the costs further. The typical Sun workstation of the 1980s cost around 30,000 US$, while the typical Linux computer of the 1990s cost around 3,000 US$ and the software was free. The 2000s were the decade of the internet: Linux-based servers and clusters powered the internet, and the open-source ecosystem was recognized by the public.

The bad news is that Unix is today no longer a computer science project. At least since the 1980s, and more obviously since Linux, it has been driven by money, or more specifically by the idea of reducing the costs of computing. In this view Torvalds acts less like a researcher or a programmer and more like a lawyer: his decisive contribution concerns copyright law and the licensing and protection of source code. It is not possible to see Linux as part of the hacker community or of computer science research; it is something different.

But if Unix is no longer interesting from a scientific perspective, what has replaced it there? Let us go back to the 1970s. At that time, operating systems, timesharing and GUI development were academic topics; the work was done at universities. Since the 1970s the amount of work done at universities has shrunk, and today nobody at Stanford or MIT is interested in Unix anymore. Instead, computer software is a commercial product, sold by Microsoft, Red Hat and Google (Android). It may look surprising to old-school hackers that Unix is no longer the epicentre of future development. But the point is that everything has been figured out: it is clear how to write a compiler for a programming language and what an operating system must do. It can be compared with the invention of the steam engine: in the beginning, steam was a research subject, but later ordinary companies produced the machines.

Cross-compilation from Linux to the world

Linux is not yet another operating system; it is the central hub for programming software which has to run under other operating systems. Suppose we have a hello-world program in C++ and want to compile it for different target platforms: MS-Windows, Mac OS X, iPhone OS, Android, PlayStation. The normal way is to first install a virtual machine for each of these targets. Installing MS-Windows in KVM is easy, Mac OS is more difficult but possible, iPhone OS is really hard and Android is easy. The reason why it is so complicated has nothing to do with Linux itself; it is the fault of the counterpart. That means it is easier to install an open-source operating system in a virtual machine than a closed-source one.
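
For the MS-Windows target there is also a route that needs no virtual machine at all: the MinGW-w64 cross-compiler packaged by most distributions. A rough sketch, assuming a plain hello.cpp exists in the current directory:

# cross-compile a C++ hello world on a Linux host for a 64-bit Windows target
x86_64-w64-mingw32-g++ -static -o hello.exe hello.cpp
# the resulting hello.exe can be copied to a Windows machine, or tested with Wine
wine hello.exe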

Now let us look at the other direction. We use MS-Windows as the main operating system and want to produce an ELF binary for Linux there. This can be done very easily: we only need a working Fedora in VirtualBox, which is perfectly documented, and we can program an app for the GNOME GUI inside it. Only the opposite direction does not work well. That means installing a proprietary operating system in a virtual machine is very difficult; even older systems like the Super Nintendo game console do not run smoothly in such environments. The same is true for Windows or Mac OS.

Can the Linux community improve the situation? No; with KVM and QEMU everything is in place. The problem is that Mac OS X and Windows are not fully documented and their source code is not available. That makes it difficult to cross-compile an application for them. The best comparison is perhaps the above-mentioned Nintendo Super NES. It was designed with the idea that nobody apart from Nintendo should program software for it; the technology is not cross-compiler friendly. Mac OS X is very similar to the Super NES: the system itself is easy to use and most people love the platform, but writing software for it is a nightmare.
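
To illustrate how little effort the open-source side requires, here is a rough sketch of booting a freely downloadable Fedora installer with QEMU/KVM on a Linux host; the file name Fedora.iso is a placeholder for the downloaded image:

# create an empty disk image for the guest
qemu-img create -f qcow2 fedora-disk.qcow2 20G
# boot the installer with KVM acceleration, 4 GB RAM and two CPU cores
qemu-system-x86_64 -enable-kvm -m 4096 -smp 2 -cdrom Fedora.iso -drive file=fedora-disk.qcow2,format=qcow2 -boot d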

Let us investigate cross-compilation from and to Mac OS X in detail. If somebody has an iMac and wants to program software which runs on Linux, he will have no problems: he simply installs a Linux distribution and writes the software for this target. The ISO image of a Linux distribution can be downloaded from the internet without any costs, and the compilers too. But the other way around is difficult or even impossible, because installing Mac OS X in a virtual machine is not supported by Apple and the code is closed source. Cross-compiling software for it is also restricted, because Apple alone decides who is allowed to create software.

That is surprising, because in theory Mac OS X and Linux are both Unix-compatible. They share the Unix heritage and both are used for professional programming tasks. The difference is that Linux was designed as open by default, which makes it easy to write software for it, while programming software for the Mac OS X target is hard.

The problem with Mac OS X, MS-Windows and iOS is that all of them were created like gaming consoles. Similar to the Super Nintendo Entertainment System, they are closed platforms which cannot easily be emulated on other systems.

Reliable Linux statistics found

One estimate puts the number of Linux users worldwide at 100 million. The assumption is that 2.6% of all websites are visited with a Linux-based browser and that the total number of internet users is 3 billion. In my opinion the estimate of 100 million is too high. About Fedora it is known that around 1 million installations are in use worldwide. In reality the figure of 2.6% is misleading, because many Linux users have Windows installed and use Linux only in VirtualBox to test how Ubuntu works. I would guess that the number of real Linux users who boot the kernel on physical hardware is around 5 million worldwide, and the rest are VirtualBox users.
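
As a quick check of the arithmetic behind that figure, assuming the 2.6% browser share is applied directly to the 3 billion internet users:

0.026 × 3,000,000,000 = 78,000,000

which is then rounded up rather generously to about 100 million.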

Linux migration is done by Microsoft

Some Linux fanboys see themselves obliged to defend open-source software against MS-Windows. Others try to increase the market share of Linux on the desktop. But I think the Linux community has to do nothing, because Microsoft already makes the best advertisement for open source on its own. The current strategy is remarkably pro-Linux oriented. For example, they back the MonoDevelop system for programming C# apps under Linux, they want to make Ubuntu a first-class guest in their hypervisor, they have invented the Windows Subsystem for Linux for running command-line Linux apps on Windows, and so forth.

If we think this further into the future: perhaps in some years Microsoft will decide that their users should run GNOME apps natively on the Windows desktop, because they have integrated the GTK+ framework there? It seems that the pressure on Microsoft to be open towards open source is high. I would say that Microsoft can be called a forerunner in Linux marketing. If they keep up the same strategy in the coming years, they will motivate more users to try Linux from within Windows.

Let us watch how a typical Windows user experiences Linux. He downloads the free tool VirtualBox from the Oracle website and boots the Ubuntu .iso image in the virtual machine. He tries out some functions and after a while he decides that Windows is better, so he stays on Windows. Is something wrong with this game? No, everything is perfect, because it is up to the user to decide. If he likes Ubuntu, that is nice; if he thinks that Windows is better, that is also OK.

The main advantage of Linux is that it is superior in any case. It has a market share on supercomputers of around 100%, it dominates the market for web servers, it has a market share on smartphones of around 80%, and every good programmer worldwide thinks that Linux is better than Windows and Apple combined.

I do not think that Linux is in some kind of race against Microsoft. No, Linux has won the race; Microsoft must follow Linux. The enemy is somewhere else. The enemy is that Linux is not paranoid enough; the penguin suffers from obesity. The main idea behind open-source software is that it is by default and by definition superior to proprietary software. If a product costs nothing, it is not possible for this product to lose any challenge. The question now is: what is the next challenge for Linux? Getting a high market share on the desktop is no real challenge, because Microsoft will solve that problem on its own. I think the next big challenge is open hardware. Today, every Linux installation runs on proprietary CPUs. The reason is that chip design is very complicated and was postponed in the past. The next big step for Linux is to extend the idea of open source to open hardware. SiFive has recently launched CPUs based on the RISC-V architecture. I think this is the next step to bring Linux forward.

The enemy is not Microsoft. Today Microsoft is a big Linux fan, because they have no choice. In a recent interview some programmers have even speculated about making parts of Windows open source. If that is not a victory for Linux, what is? No, the next challenge is to beat the hardware industry. Players like Intel, ARM and AMD are selling closed-source hardware, and this has to be overcome.

Migrating all people to a Linux desktop is simple. From a technical point of view, the desktop is ready: the current Ubuntu distribution or the RHEL Workstation edition runs great. The reason why people are not migrating has nothing to do with technology. That means Linux runs stable, has no major bugs and can hardly be made better. But the reason why people use Intel CPUs and not RISC-V is technical. That means, even if they wanted to migrate, they cannot because of too many problems.

Switching from Windows to Linux on the desktop is easy. A simple USB boot stick transforms every notebook in around 3 hours into a first-class Linux workstation with all the programs the user needs. But switching from Intel to RISC-V cannot be done in such an easy way, because the original RISC-V development board costs around 1000 US$, which is more than an Intel CPU, and the pathway for the transition is not documented. So it can be called unexplored land. I think this will be the next competition.
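
A rough sketch of how such a boot stick is created from an existing Linux system, assuming the installation image has been downloaded as ubuntu-desktop.iso (a placeholder name) and the stick appears as /dev/sdX (check the device name with lsblk first, because dd overwrites the target):

# write the installation image onto the USB stick; this destroys all data on it
sudo dd if=ubuntu-desktop.iso of=/dev/sdX bs=4M status=progress conv=fsync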

In contrast, the aim of bringing Linux to the desktop is pointless. First, the market share on other devices like smartphones and servers is already high enough and the number of desktop PCs does not count; second, the task is technically solved, because the current GNOME GUI runs great on every PC; and third, in most cases the migration fails because of the users themselves. They prefer Word and their commercial games. Even if all bugs in Linux were fixed one day, this would not bring Linux to the desktop.

If a migration project failed because of technical problems, for example because Linux had problems booting on a certain PC model, then it would make sense to invest energy into the problem and make the system more robust. But as far as I know, Linux runs great on 99.9% of all hardware. Even the installation of a printer is easier than under MS-Windows: all the user has to do is plug in the USB cable and the driver is installed automatically. So in which part of the software should the programmer fix a bug? Right: if the technology works, the game is won.

It is important to find a new challenge, that is, unsolved problems which are important and motivate people to invest their time. If in 10 years the RISC-V CPU is in the same state as today and it is technically not possible to run a Linux kernel on it, then the community has done something seriously wrong. If in 10 years the market share of Linux on the desktop is at the same level as today, under 1%, that is not a problem. It is not the fault of Linux.

Linux distributions for education vs. workplace

It is possible to compare Linux distributions against each other; for example, a battle between ArchLinux and Debian makes sense. Before doing so, it is important to classify the candidates into two groups: distributions for education purposes and distributions for production environments. The following distributions belong in the education category:

– Linux from Scratch
– Damn Small Linux
– ArchLinux
– Debian, Ubuntu
– Gentoo

The following can be part of the production / workplace category:

– RHEL, Fedora

I think it makes no sense to compare distributions from different groups, for example RHEL vs. Linux from Scratch; it is only possible to compare distributions from the same group. Let us investigate the teaching distributions in detail. The Linux from Scratch project is a very good example. It consists primarily of a guideline, the LFS book, which has been translated into German, Chinese and Russian. If somebody is interested in understanding not only the Linux kernel itself but also the distribution around it, the LFS book will be his friend.

Another example of a good teaching distribution is ArchLinux. Its main contribution is a package manager called pacman, which is heavily documented in a wiki. The other projects on the list are Gentoo and Ubuntu, which are also well documented, and for every user who is interested in learning Linux it is a must to take a short look into the manuals.

Can one of the education Linux distributions be used as a productive machine? The answer is simple: no. LFS, Ubuntu and Gentoo were designed as manual-first distributions. It makes sense to read the wiki, test the distribution in a virtual machine and try to compile a package on one's own. For example, on ArchLinux the Arch Build System together with makepkg makes it easy to compile the whole Linux kernel from source code into a binary package that can be installed with pacman and booted.
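
A rough sketch of that build workflow, assuming the PKGBUILD of the official linux package has already been downloaded into the current directory:

# build the kernel package from its PKGBUILD; -s installs the build dependencies,
# the result is a binary package file
makepkg -s
# install the freshly built kernel package and reboot into it
sudo pacman -U linux-*.pkg.tar.*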

But using any of the teaching distributions as a real server is questionable. The reason is simple: a Linux distribution must be updated regularly. With LFS the updates are left entirely to the user, and on a rolling system like ArchLinux an update can break the system; after updating it is possible that the machine no longer boots. For educational purposes that is no problem, because the operating system runs entirely in a virtual machine and fixing problems is part of the lesson. But for a real-life server or workstation it is not an option.

I think it is the same problem as in the Minix vs. Linux debate. Minix is a wonderful operating system for teaching students what UNIX is and how they can program their own operating system from scratch; for that purpose Minix is superior to Linux. On the other hand, Minix is a bad choice if somebody needs a LAMP-like server as the backbone of his infrastructure.

In the category of Linux distributions for productive usage only two remain: RHEL and SUSE. Nowadays SUSE Linux is no longer that important; the company was bought by Novell and uses the same package system as Red Hat, the RPM format. So in reality only one Linux distribution for productive usage is left: RHEL. The current version is RHEL 7.5 and it can be used for productive servers and workstations. The disadvantage of RHEL is that the user cannot learn much from it: there is no outstanding wiki or online book which explains what a Linux distribution is and how the user can build it from scratch. What Red Hat has instead is an online shop where the user can type in his credit card number. On the other hand, the RHEL distribution is very stable, which means the server can be used for mission-critical applications.

The main problems happen when a Linux distribution is used for the wrong purpose, for example when someone installs Linux from Scratch in a company to run the web server. From a technical point of view it is possible; the server will run. But after a while major problems will occur which cannot be fixed easily. The problem is not LFS itself, the distribution is fine, but it was not made for such a purpose.

A different kind of problem occurs if someone teaches Linux in a class and uses RHEL to demonstrate how to build a distribution from scratch. The students will not understand it, because RHEL is too advanced, it has too many packages, and behind the distribution is a company with a certain strategy and certain customers. Instead of using RHEL for teaching, I would recommend Minix-VMD or Knoppix.


In my opinion, Linux from Scratch, Debian and ArchLinux have a great future. In the niche of teaching distributions they are wonderful projects. In a recent comparison LFS was described as follows:

“This is no doubt a large undertaking. LFS isn’t entirely feasible as a production system, it’s more of an educational adventure.”

The same is true for ArchLinux or Gentoo. The open question is whether it is possible to use these distributions for more than a teaching experience, for example as a real server. The problem is that LFS and Gentoo are not suited for that purpose. It is not possible to develop these projects into production usage; they would lose their identity, it would destroy the idea. In my opinion it is better to create a dedicated production distribution, that is, a distribution without much philosophy or documentation, but with working packages which are distributed in binary format. The funny thing is that nobody has to decide for only one of these purposes. It is possible to use RHEL for running a PC and Linux from Scratch for learning about it. Perhaps this can even be called a best-practice method: new users install Fedora on their machines first, and inside a QEMU virtual machine they play with LFS and Ubuntu to improve their knowledge.