
News

Posted over 15 years ago by Alon Swartz
Hi, my name is Alon and I'm a news-a-holic. Let's face it, some of us have addictions, be it alcoholism, nicotine, gambling, over-eating, television, or even just mowing the lawn. Mine is being up to date with the latest news. Some addictions are worse than others in how self-destructive they are, and as you already know, identifying and acknowledging that you have a problem is the first step.

I am also a work-a-holic, so when I realized that having to know the latest news was impacting my workflow, I needed to find a solution. Luckily, with a little self-discipline I was able to make a rule and stick with it: "Access to the RSS reader is off limits from the work laptop". The problem was how to still consume news, with the added bonus of easily sharing interesting articles I come across on Twitter (I am reading the news anyway, so why not share it). So, this is what I came up with.

I have long been a fan of RSS, and have a long list of feeds set up in Google Reader, which I now only access from my mobile. When I come across something that is really interesting, I hit "share". Everything I share in Google Reader appears on my shared items page, which has an RSS feed of its own. Updating Twitter from an RSS feed is simple thanks to services like twitterfeed. I have also configured twitterfeed to use my bit.ly account (URL shortener) so I can get click-through stats. Initially I planned on tweaking the news items I share according to the stats, but in practice I don't. I don't even check up on the stats, so I guess it's just a nice-to-have.

So there you have it: sites that interest me are fed into Google Reader. Articles that I share appear in my RSS feed. Twitterfeed picks up the RSS feed, shortens the links with bit.ly, and sends them to Twitter.

Do you have an addiction? How do you tweet? Leave a comment!
Posted over 15 years ago by Liraz Siri
Smack yourself in the forehead if you don't use the following snippet (or an equivalent) in all custom shell scripts that could benefit from a hooking mechanism:

    run_scripts() {
        for script in "$1"/*; do
            # skip non-executable snippets
            [ -x "$script" ] || continue

            # execute $script in the context of the current shell
            . "$script"
        done
    }

    run_scripts path/to/hooks.d

I've found this pattern useful in nearly every sufficiently complex shell script I maintain, and I even use it to help manage my bashrc and xsession configurations:

    $ ls .bashrc.d/
    paths git editor pager autologout qemu tmpdirs scratch

    $ ls .xsession.d/
    10-lang 10-tmpdir 90-bell 90-kbdrate 91-xscreensaver 99-fvwm-conf

Features:

- Modular: add or remove code without having to edit a big monolithic file.
- Order of execution is determined by the filename. Changing the order is as simple as changing a number prefix.
- Disable execution by removing the execution bit: chmod -x .bashrc.d/git

These days, breaking configurations down into separate modular files like this is common in the Linux world, so by now I expect many experienced users are wondering why I'm channeling Captain Obvious. Just keep in mind that many Linux newbies haven't yet learned all our best practices, and some lessons are worth reteaching.

Why not share your own tricks? Post a comment!
Posted over 15 years ago by Alon Swartz
Or, Celery + RabbitMQ = Django awesomeness!

As you know, Django is synchronous, or blocking. This means each request will not be returned until all processing (e.g., of a view) is complete. That's the expected behavior, and usually required in web applications, but there are times when you need tasks to run in the background (immediately, deferred, or periodically) without blocking.

Some common use cases:

- Give the impression of a really snappy web application by finishing a request as soon as possible, even though a task is running in the background, then update the page incrementally using AJAX.
- Executing tasks asynchronously and using retries to make sure they are completed successfully.
- Scheduling periodic tasks.
- Parallel execution (to some degree).

There have been multiple requests to add asynchronous support to Django, namely via the Python threading module, and even the multiprocessing module released in Python 2.6, but I doubt it will happen any time soon; actually, I doubt it will ever happen. This is a common problem for many, and after scouring many forum posts, the following proposed solution keeps popping up, which reminds me of the saying "when all you have is a hammer, everything looks like a nail":

- Create a table in the database to store tasks.
- Set up a cron job to trigger processing of said tasks.
- Bonus: Create an API for task management and monitoring.

Well, you can do it like that, but it usually leads to ugly, coupled code which can become very complex over time, isn't very flexible, doesn't scale well, and is generally a bad idea. In my opinion, it ultimately comes down to separation of concerns. I recently fell in love with the message queuing world (AMQP), in particular RabbitMQ, which can be used as an integral part of a really elegant solution for this issue, especially when coupled with Celery. Define a task. Send it to a processing queue. Let other code handle the processing.
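Those three steps (define a task, send it to a queue, let other code process it) can be sketched with a toy, stdlib-only stand-in. To be clear, this is not Celery's API and there is no broker here; it is just an illustration of the pattern using a thread and an in-process queue, and all names are hypothetical:

```python
import queue
import threading

# Hypothetical stand-in for a message broker: an in-process queue.
task_queue = queue.Queue()
results = []

def my_task(some_arg):
    # the "task": runs in the worker, not in the request cycle
    results.append("Did something: %s" % some_arg)

def worker():
    # the "other code" that handles the processing
    while True:
        func, args = task_queue.get()
        func(*args)
        task_queue.task_done()

def delay(func, *args):
    # "send it to a processing queue" -- returns immediately
    task_queue.put((func, args))

threading.Thread(target=worker, daemon=True).start()
delay(my_task, "foo")  # the caller is not blocked
task_queue.join()      # wait for the worker (only for this demo)
print(results)         # ['Did something: foo']
```

With Celery and RabbitMQ, the queue lives in the broker and the worker can run in another process or on another machine entirely; that is what makes the real thing scale where this toy cannot.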
What is Celery

Celery is a task queue system based on distributed message passing. Originally developed for Django, it can now be used in any Python project. It's focused on real-time operation, but supports scheduling as well. The execution units, called tasks, are executed concurrently on one or more worker servers. Tasks can execute asynchronously (in the background) or synchronously (wait until ready). Celery provides a powerful and flexible interface for defining, executing, managing and monitoring tasks. If you have a use case, chances are you can do it with Celery.

Installation and configuration

Install Celery

One of Celery's dependencies is the multiprocessing module released in Python 2.6. If you have an earlier version, such as Python 2.5, you're in luck, as the module has been backported. The backported module needs to be compiled during installation, so let's install the required build support:

    apt-get install gcc python-dev

Now we are ready to install Celery. Let's install a few more dependencies and let easy_install take care of the rest:

    apt-get install python-setuptools python-simplejson
    easy_install celery

Install RabbitMQ

Celery's recommended message broker is RabbitMQ, a complete and highly reliable enterprise messaging system based on the emerging AMQP standard. It is built on a proven platform and offers exceptionally high reliability, availability and scalability. In the example below, I download and install the latest release at the time of writing, but you should check the download page for newer versions and/or support for your platform.

Note: Installation will fail if there are missing dependencies. Because of this, we use the --fix-broken workaround.

    wget http://www.rabbitmq.com/releases/rabbitmq-server/v1.7.2/rabbitmq-server_...
    dpkg -i rabbitmq-server_1.7.2-1_all.deb
    apt-get --fix-broken install

The default installation includes a guest user with the password of guest.
Don't be fooled by the wording of the account: guest has full permissions on the default virtual host, called /. We will use the default configuration below, but you are encouraged to tweak your setup.

Configure your Django project to use Celery/RabbitMQ

Add the following to settings.py:

    BROKER_HOST = "127.0.0.1"
    BROKER_PORT = 5672
    BROKER_VHOST = "/"
    BROKER_USER = "guest"
    BROKER_PASSWORD = "guest"

    INSTALLED_APPS = (
        ...
        'celery',
    )

Synchronize the database:

    python manage.py syncdb

Sample code

Now that everything is installed and configured, here is some sample code to get you started. But I recommend taking a look at the Celery documentation to get acquainted with its power and flexibility.

fooapp/tasks.py:

    from celery.task import Task
    from celery.registry import tasks

    class MyTask(Task):
        def run(self, some_arg, **kwargs):
            logger = self.get_logger(**kwargs)
            ...
            logger.info("Did something: %s" % some_arg)

    tasks.register(MyTask)

fooapp/views.py:

    from fooapp.tasks import MyTask

    def foo(request):
        MyTask.delay(some_arg="foo")
        ...

Now start the daemon and test your code:

    python manage.py celeryd -l INFO

For convenience, there is a shortcut decorator, @task, which makes simple tasks that much cleaner.

A note on state: since Celery is a distributed system, you can't know in which process, or even on what machine, a task will run. So you shouldn't pass Django model objects as arguments to tasks; it's almost always better to re-fetch the object from the database instead, as there are possible race conditions involved.

Have you ever needed to use background/deferred execution in Django? Post a comment!
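As a sketch of that shortcut decorator, the class-based task above could be condensed to something like the following. This assumes the Celery 1.0-era celery.decorators module; check the documentation for your version, as the decorator's location and behavior may differ:

```python
from celery.decorators import task

@task
def my_task(some_arg, **kwargs):
    # the decorated function becomes a task with .delay(), like MyTask above
    logger = my_task.get_logger(**kwargs)
    logger.info("Did something: %s" % some_arg)

# views.py would then call: my_task.delay(some_arg="foo")
```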
Posted over 15 years ago by Liraz Siri
I recently eliminated a bit of code that was supposed to handle upgrading our build infrastructure from using one distribution (e.g., Ubuntu 8.04 LTS) to another (e.g., Ubuntu 10.04 LTS). That got me thinking about how to decide (and then explain) when it's a good idea to automate and when it isn't.

Since writing and maintaining any piece of software comes with a cost, you always have to weigh the costs against the benefits. I boiled it down to the following:

- Good automation: the best kind of automation reliably handles tasks that are slow, error prone and labor intensive to perform manually. That kind of automation is usually a good idea, because once it works well enough you can basically forget about it and enjoy the benefits of an accelerated development cycle (AKA a tightened developer feedback loop) and less friction from debugging human mistakes. It frees up your mind and precious labor for other tasks.
- Bad automation: the worst kind of automation handles, in an unreliable, error-prone way, tasks that are infrequently performed and simple to handle manually. Not only do you pay the cost of developing and maintaining this kind of bad automation, you also get an overall decrease in productivity for your efforts, because the automation mechanism will require more maintenance and attention than the tasks it supposedly handles.

A litmus test for good vs. bad automation is the ability to test. For example, if you try to automate something that happens infrequently, there is a good chance that the assumptions embedded in your automation will not hold over time. Case in point: since there is no way to test that a bit of automation will work for a future release transition, it's safe to assume it probably won't.
Posted over 15 years ago by Alon Swartz
Appliance: Joomla Appliance

We have just finished successfully testing the Joomla 1.5.15 package on the 2009.10 release, and have updated our package archive to include it. Joomla 1.5.15 is quite a large update, and includes moderate security fixes. Because the update is large and the security fixes are not considered critical, appliances will not upgrade automatically, for stability reasons. You should upgrade at your earliest convenience:

    apt-get update
    apt-get install joomla15
Posted over 15 years ago by Alon Swartz
Appliance: Bootstrap JeOS

- Upgraded base distribution to Ubuntu 8.04.3 LTS

Links: Release meta-files (signature, manifest)
Posted over 15 years ago by Alon Swartz
Appliance: phpBB Appliance

Changes:

- Bugfix: Fixed database restore functionality (broken due to phpbb package permissions bug LP#364379).
- Bugfix: Webmin Firewall configuration.
- phpBB improvements:
  - Added pmadb (linked tables) advanced features to PHPMyAdmin (LP#426303).
  - Pinned PHPMyAdmin to update directly from Debian stable (security).
- di-live (installer) MySQL component:
  - Added support for complex passwords (LP#416515).
  - Added CLI options (user/pass/query/chroot).
  - Bugfix: Removed build system's hostname from MySQL user table.
- Regenerates all secrets during installation / firstboot (security).

Major component versions:

- phpbb3 3.0.2-4
- mysql-server 5.0.51a-3ubuntu5.4
- apache2 2.2.8-1ubuntu0.11
- phpmyadmin 2.11.8.1-5+lenny1

Note: please refer to turnkey-core's changelog for changes common to all appliances. Here we only describe changes specific to this appliance.

Links: Release meta-files (signature, manifest)
Posted almost 16 years ago by Liraz Siri
I'm proud to announce the 2009.10 release batch, featuring:

- 25 new additions to the TurnKey Linux virtual appliance library
- Added native virtual appliance packaging (OVF support included)
- Amazon EC2 support, with EBS persistence
- Core improvements: Ajax web shell, upgraded to Ubuntu 8.04.3
Posted almost 16 years ago by Liraz Siri
Appliance: Bugzilla Appliance

Changes:

- Initial public release of TurnKey Bugzilla.
- SSL support out of the box.
- Bugzilla configurations:
  - Configured cron jobs to collect stats and whine.
  - Includes support for dependency graphs.
  - Includes documentation.
  - Disabled upgrade notification (handled by APT).
- Pinned Bugzilla and related packages to update directly from Debian (security).
- Includes Postfix MTA (bound to localhost) to allow sending of email from Bugzilla (e.g., password recovery). Also includes webmin-postfix module for convenience.
- Regenerates all secrets during installation / firstboot (security).

Major component versions:

- bugzilla3 3.0.4.1-2+lenny1
- mysql-server 5.0.51a-3ubuntu5.4
- apache2 2.2.8-1ubuntu0.11

Note: Please refer to turnkey-core's changelog for changes common to all appliances. Here we only describe changes specific to this appliance.

Links: Release meta-files (signature, manifest)
Posted almost 16 years ago by Liraz Siri
Appliance: Revision Control Appliance

Changes:

- Initial public release of TurnKey Revision Control.
- Includes TurnKey web control panel (convenience).
- SSL support out of the box.
- Version control systems with web frontends:
  - git (gitweb): git://addr/git
  - svn (websvn): svn://addr/svn
  - bzr (loggerhead): bzr://addr/bzr
  - mercurial (hgweb): http://addr/hg
- Includes custom developed init scripts for bzr and svn.
- Includes bzrtools, bzr-rebase, subversion-tools (useful extras).
- Includes exemplary helloworld repositories.
- Pinned websvn to update directly from Debian (security).
- Regenerates all secrets during installation / firstboot (security).

Major component versions:

- git-core 1:1.5.4.3-1ubuntu2.1
- gitweb 1:1.5.4.3-1ubuntu2.1
- bzr 1.3.1-1ubuntu0.1
- loggerhead 1.10-1turnkey+8+g5cd7b60
- subversion 1.4.6dfsg1-2ubuntu1.1
- websvn 2.0-4+lenny1
- mercurial 0.9.5-3
- apache2 2.2.8-1ubuntu0.11
- build-essential 11.3ubuntu1

Note: please refer to turnkey-core's changelog for changes common to all appliances. Here we only describe changes specific to this appliance.

Links: Release meta-files (signature, manifest)