Recently I changed jobs, and at my new workplace I've got a MacBook Pro.
Working on a Mac can be quite difficult for someone who has used PCs for 20 years. The Mac OS X UI is not bad, and some things are familiar to me after being on Linux for the last couple of years. But one thing on a Mac is very annoying: the keyboard, especially on a MacBook.
Using this keyboard feels like tiptoeing through a minefield. One wrong step and you're dead! Well, it's not really that fatal, but only because there is usually an Undo command available.
The worst thing is the absence of separate Home/End/PgUp/PgDn keys on the laptop keyboard, so you have to use the arrow keys with the Fn, Control, Option, or Command modifiers. It is just impossible! Especially because the behavior is inconsistent across applications.
I've already learned to use the Command key instead of Control, but those arrow keys with modifiers drive me crazy! I desperately need a full keyboard!
Friday, March 2, 2012
Monday, August 22, 2011
Master password in Fennec
Last week we finally fixed the master password bug 592772. It had one of the longest histories I've seen: 130+ comments and 12 versions of the patch! A funny thing about it is that the final patch is almost the same as the very first version, except for some minor tweaks.
OK, some background. The need for this functionality arose when we implemented SD card support, specifically moving the user profile to the SD card when the application is moved there. It was a wanted feature, because the user profile may grow to tens of megabytes (37MB was mentioned), which is quite a lot for many phones. Mike Beltzner formulated this pretty well in his comment:
While I understand that instinct, and further understand that the newer generations of Android phones are not as limited as the Nexus One and its contemporaries in terms of the internal/SD memory split, I can also tell you that as a Nexus One owner, this really does prevent me from using Fennec as intended. My choices are:
- Fennec + Sync and a limited number of other applications
- Fennec w/o Sync and my applications
- No Fennec and my applications

The feature, while helping users free up some internal phone memory, had a big drawback: it made all the user data fully exposed and easily accessible, because the SD card uses the FAT file system and doesn't have any protection. So if the card is lost or stolen, someone could get access to all the personal information in the Firefox user profile - most importantly, all the stored passwords for web sites.
To avoid that, the important data had to be encrypted. Desktop Firefox already has such a feature, called "master password". The key*.db file in the user profile can be encrypted with the password, so when web site login data needs to be saved, or Firefox detects it can auto-fill the login information on a site, the user is prompted to enter "the master password for the Software Security Device".
The original goal of the master password bug 592772 was just to implement the same functionality - basically, have a preference and prompt the user for a password, similar to how it's done on desktop. But a simple prompt had some UX issues. It could come up several times in a row when more than one site requiring access to the password database was open, or in some specific cases, like in bugs 624552 and 624570. So the idea came up to implement a new feature, later called "auto-password". The idea was pretty simple: use the same internal master password functionality, but instead of asking the user to enter the password, generate it automatically and store it on the device, so the data would be protected the same way as on desktop, but without the "UX hell".
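The auto-password idea comes down to generating a strong random secret once on the device instead of asking the user for one. Here is a minimal sketch of the generation step in plain Java - the class name and key length are my own choices, not from the actual patch, and the storage side is omitted:

```java
import java.security.SecureRandom;
import java.util.Base64;

public class AutoPassword {
    // Generate a random master password. This is a hypothetical sketch of
    // the "auto-password" idea, not the real Fennec implementation.
    public static String generate(int numBytes) {
        byte[] buf = new byte[numBytes];
        new SecureRandom().nextBytes(buf);  // cryptographically strong randomness
        // Base64-encode so the secret can be stored and passed around as text
        return Base64.getEncoder().encodeToString(buf);
    }

    public static void main(String[] args) {
        // 24 random bytes -> 32 Base64 characters
        System.out.println(generate(24));
    }
}
```

The point is that such a password never has to be memorable - it only has to be unpredictable, since the user never sees it.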
I did some research on ways to store the generated password and implemented a prototype that used an AuthenticationService and an Authenticator - similar to how it's done for Android applications that log the user in, like Gmail, Facebook, etc., but unlike those, it stored the password locally rather than on a server. It wasn't the best way, though. Apparently the current version of Android does not have any system-wide secure storage that could be used for our purpose, so the only place to store the password was the internal application data directory. That revealed one more important issue: our application was marked as debuggable in the Android manifest file, to make debugging easier, obviously. But that flag made all application data readable and accessible by other applications, and through a USB connection. This bit was fixed in a separate bug, turning off the debuggable flag in the official builds.
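For context, the flag in question lives in AndroidManifest.xml; making sure it is off in release builds looks roughly like this (the attribute values other than android:debuggable are placeholders):

```xml
<!-- AndroidManifest.xml fragment: official builds must not be debuggable,
     otherwise the app's private data directory becomes readable from outside -->
<application
    android:label="@string/app_name"
    android:debuggable="false">
    <!-- activities, services, etc. -->
</application>
```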
The unnecessary complexity of the authenticator approach, where most of its functionality wasn't really used, plus some minor issues related to it (for example, our authentication service, which was used only internally, was visible in the system settings, which we didn't really want) led to a simplified approach: store the generated password in the application's SharedPreferences, which, despite the name, are private to the application and, with the debuggable option off, not accessible by others. Then there was another modification, with the password stored in a simple file inside the Firefox data directory. This was done to avoid an extra call into the Java layer through the Android bridge, as the latest implementation was entirely in platform code.
The last approach to the auto-generated master password feature combined the auto-password service with the original UI preference, to let security-conscious users still have their own master passwords, which were not stored anywhere on the device. The feature passed the security review, with comments that it might be useful not only on Android but on other systems as well, where the generated password could be securely stored in a system keyring. But later, during the last code review, more concerns were raised by Brian Smith, with an alternative suggestion: instead of implementing a whole new auto-password feature, which would encrypt the password database and store the key in internal memory, just keep the password database itself in that same internal memory, by splitting the user profile. This had actually been discussed before, but it wasn't clear how difficult profile splitting would be to implement.
During another security review, covering the whole approach of storing user profile data on the SD card, those concerns and possible ways to resolve them were discussed. The final decision was to drop the auto-password feature, look into the possibility of profile splitting, and implement the master password UI as originally planned.
That was the final implementation submitted, and it will be included in the next release: just a master password preference, similar to the one on desktop. The auto-password feature was moved to a separate bug 678412, with a work-in-progress patch attached.
Monday, August 1, 2011
Mozilla blog
I tried to avoid blogging about the Mozilla stuff I work on, as I'm a bit shy :), but I guess I have to do it eventually. It is actually useful, as a lot of information gets lost if it's not recorded anywhere, so a blog should be a good place to keep it.
I joined Mozilla two years ago, in July 2009, to work on Fennec. There was a plan to make a version for Symbian, and my previous experience with Symbian could help. But the focus at that time was on Windows Mobile, and since I had developed for that system as well, I started working on Fennec for WM. We were almost ready for the first release when Microsoft killed Windows Mobile in favor of the new shiny Windows Phone 7. It turned out that the new version, unfortunately, was not compatible with the previous ones: it didn't have an SDK for developing apps in C++ against the underlying Windows CE, like all the previous versions, including PocketPC and Windows Mobile, did. That's when I switched to Android, as it became our main priority.
From time to time I work on issues that require a lot of investigation and raise a lot of questions. I will try to post here about those issues, to give some coverage of what's happening behind the scenes.
Saturday, July 30, 2011
Becoming a website admin and designer...
We've started a new business - unique T-shirt design!
My responsibility is, obviously, the web site: setup, support, and maintenance. It's actually a pretty interesting and somewhat new experience for me.
I didn't want to start anything from scratch, because I knew there were ready-to-use content management systems. I had a quick look at Joomla and did some research on Drupal, but they were too powerful and complicated, so I looked for alternatives. It turned out there are quite a lot of specialized e-commerce engines, and many of them are free. I liked the name OpenCart, tried it, and without much hesitation selected it as our platform. It's not too complicated and pretty usable. There are various modules and extensions available, so it was possible to customize it for our purposes.
The site is now live and working! Check it out: www.excellentshirt.com
Our first and my favorite design:
Sunday, April 24, 2011
New PC: Video cards swapping - II
Yesterday I decided to swap the video cards back.
Despite the good cooler, the 9800GT was getting too hot. Apparently there is just not enough air and ventilation - thanks again to the Dell motherboard design. The video card sits at the very bottom, next to the only PCI slot, so there is less than an inch of space between the heatsink and the bottom of the main case chamber. The optional fans of the Turbo Module barely fit there, almost touching the wires. Under heavy load the card temperature was over 100°C!
So I decided to put the original ATI HD5670 back - it's a bit less powerful, but being newer it uses less power than the 9800GT, so it runs much cooler. I put the card back thinking I was just returning to the original configuration, but what a mistake! Well, the old PC didn't have any issues - Windows XP just reused the nVidia driver, and I only had to disable autostart of Catalyst, which was complaining about the missing ATI card.
Windows 7 on the new PC didn't have problems either - again, it just reused the ATI driver, and I re-enabled the Catalyst autostart.
But it was a completely different story with Ubuntu. It just didn't boot to the GUI! The monitors stayed black with blinking power indicators, which meant there was no proper signal from the video card. Booting into recovery mode and attempting to reconfigure graphics didn't help. Neither did playing with xorg.conf. I removed everything related to nVidia - no luck. Removed the ATI drivers as well - nothing... Then somewhere in the logs I noticed mentions of VMware and some warnings. I remembered that VMware gets pretty deep into the system - it even compiled some kernel modules during installation. So I decided to give it a try. I found out how to uninstall it (sudo vmware-installer -u vmware-player), and after some more playing with recovery mode the system could finally boot into X.
It wasn't the end, though. There was no network connection! Apparently when VMware was uninstalled, it took the network configuration with it. More googling helped me find that "eth0" was no longer listed in /etc/network/interfaces. I just added it to the first line ("auto lo eth0"), rebooted, and voilà - it worked!
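For reference, the resulting /etc/network/interfaces looked something like this (only the "auto lo eth0" line is the actual fix; the loopback and DHCP stanzas are what a typical Ubuntu install of that era would already contain):

```
auto lo eth0
iface lo inet loopback
iface eth0 inet dhcp
```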
Now I have reinstalled the ATI drivers and re-configured the multi-display mode, and my computer is finally working again. No more experiments!
Sunday, April 17, 2011
New PC: Video cards swapping
The ATI HD5670 that came with my new PC is not a very powerful video card. It's better than some totally budget ones, but it's definitely at the low end. So I decided to compare it with the nVidia 9800GT from my old PC, and possibly swap them. Several years ago the 8800GT (which the 9800GT is based on - it's basically the same card) was almost top of the line, and according to tests I found on the Internet, it's still a bit faster than the HD5670.
But first I wanted to replace the cooler on the 9800GT, because the original one was too loud and the card was getting pretty hot under load. I checked what NCIX had on sale and bought this one - an Accelero S1 Rev. 2 passive VGA cooler:

The drop in temperature was huge. With the old cooler the GPU/ambient temperatures were ~70°C/47°C at idle and ~80°C/56°C under load. With the Accelero S1 they dropped to 45°C/34°C and 68°C/43°C respectively. And that's without fans, in complete silence! Heat pipes are very efficient!
So I tried both cards in the two computers, ran 3DMark06 on both, and here are the results:
| 3DMark Score | NVIDIA GeForce 9800 GT, 512MB | ATI Radeon HD 5670, 1GB |
|---|---|---|
| Old computer (Intel Core 2 Duo E7300, 2.66 GHz) | 11101 | 9987 |
| New computer (Dell XPS 9100, Intel Core i7-930, 2.8 GHz) | 12984 | 11452 |
Based on these results, I'm leaving the 9800GT in the new machine, and we'll see how it works.
Monday, April 11, 2011
New PC: Why I don't like brand named PCs
I was complaining earlier about the limited SATA connections on the Dell motherboard.
It turned out I was wrong. Look at this picture - here are the available connectors:
The black and blue ones at the edge of the board are used for the HDD and DVD drives, so there was only one available. But when I was checking the BIOS setup today, I noticed it listed more SATA slots. I did one more close inspection and finally understood what those numbers and text near the SATA ports on the motherboard meant. The black and blue connectors are actually double ports! Right under the ones that are visible and have cables connected, there are two more connectors, hidden and not noticeable from any angle. And even once I found them, it was very difficult to connect SATA cables there - blind and with limited access.
I've never seen a SATA connector design like this! If I had a manual for the motherboard, I could have figured that out earlier, but alas, it seems no such manual comes with the Dell XPS.
So it finally looked like I could connect my other HDD. But that was a hasty conclusion. I was still unable to connect the additional drive, because the SATA power cable was not long enough. It was designed to work only for a specific hard drive position in the original XPS case - the distance between the connectors is very short. I could use the one cable with three connectors to power two drives, but the second cable, even though it had three more connectors, could only be used either for the DVD drive or for another hard drive, not both. Power supplies normally have at least a couple of molex connectors. Even though they are not used for HDDs these days, there are adapters for other components, like fans, special motherboards, and PCIe cards. But the Dell power supply doesn't have any, so I couldn't even go that way. I need a SATA power splitter.
How many more surprises should I expect from this Dell?!?
Here's a list of annoyances I had to face with this Dell:
- It is too loud
- Cheap and extremely noisy VGA cooler
- Stock CPU cooler - noisy and not very good
- No HDD vibration dampers
- Non-standard motherboard design, with no documentation for it
- A bigger and better 3rd party CPU cooler doesn't fit
- Difficult to find and use extra SATA ports
- Not enough fan power connectors on the motherboard
- Only Dell front panel with non-standard USB ports can be connected (and cannot run without it)
- No molex connectors from the power supply
- SATA power cables are designed to be used only for a specific HDD configuration
- Not enough USB ports on the back
The main idea: all the components of this Dell PC were designed for one specific configuration. The ability to customize, upgrade, or replace parts is extremely limited. This kind of PC is intended for users who are not expected to do anything with it. Want to change something? Order another configuration, or even another model!
It is not for me, for sure. I never wanted to buy a brand-name PC for myself - I've seen several such machines, and every one of them used custom components that were difficult to customize or upgrade. This is the first time I had to get a brand-name PC, and it will be the last!