I recently worked with a WordPress MU installation. For ease of deployment, and probably because WordPress is largely used on shared hosting that may not offer access to cron jobs, its default way of processing scheduled tasks is through front-end and admin page requests. As a result, every time a page is requested, the system checks whether there are jobs to run. If so, a call is made to execute the tasks. The asynchronous nature of the call makes it somewhat transparent from the web user's perspective.
Although it works well, I do not feel confident using this mechanism on a site with a lot of traffic. After some Google searches to find out how it works, I read many stories about hosting companies denying access to wp-cron.php because of its bad impact on their servers.
I had another issue while using WP-SuperCache. The super cache is "super" because it avoids loading the whole PHP engine for guest requests. Unfortunately, no PHP code execution means no cron execution for super-cached pages… Since the cron is responsible for cleaning the super-cached files, they are served indefinitely, or until a user logs in (logged-in users do not get super-cached files).
So, for all these good reasons, I wanted to run WordPress cron jobs from a real cron schedule.
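Assuming a typical setup, the switch takes two steps: tell WordPress to stop spawning its cron on page requests, then have the system cron request wp-cron.php on a fixed schedule. The interval and the site URL below are placeholders to adapt to your own installation:

```
# In wp-config.php: disable the built-in page-request trigger
define('DISABLE_WP_CRON', true);

# Crontab entry (crontab -e): run the WordPress cron every 15 minutes
*/15 * * * * wget -q -O /dev/null "http://example.com/wp-cron.php?doing_wp_cron"
```

Any HTTP client works in place of wget (curl, etc.); the point is simply that a real cron daemon, not visitor traffic, now drives the schedule.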
It’s been a long time since I last posted to this blog… I’ve been searching for free time for years without success… With the summer season starting, it is even worse! So many projects in my mind! Like riding a motorcycle, growing vegetables, getting a copper tan, etc. But that is behind me now, winter is coming! As well as blog posts…
In the middle of all these projects, I managed to work on another one, about motocross. I have built a website that is quite cool. It is based on Joomla; take a look at it: http://www.mxdenzki.com.
In the coming weeks, I will post about my experience with Joomla and why I picked this CMS from the pool of free options!
From release to release, Ubuntu has always enhanced its desktop operating system. Because each edition works better than the last, I have always been curious to install the new one as soon as possible.
Now Ubuntu offers “Ubuntu One”, which is 2 GB of online storage for free. Needless to say, it is integrated into Ubuntu 9.10… Well, it is time to start backing up some stuff…
I will give the 64-bit edition of the Ubuntu 9.10 desktop a try, along with this Ubuntu One service, to see how they fit into my mixed-OS environment…
You can download a version here: http://www.ubuntu.com/getubuntu/download
If you have a netbook, check this: http://www.ubuntu.com/getubuntu/download-netbook
I have moved my domain to another hosting company, and I also took the opportunity to update my blogging software (WordPress) to a newer version. I noticed some glitches, which I fixed. If you are aware of anything not working, just post a comment on this message.
In my quest for learning, I took my first steps with Linux back in 2006 with the 6.06 LTS version. Since that time, I have learned quite a lot, but I still consider myself a newbie.
From that perspective, Ubuntu has come a long way. Every 6 months since Ubuntu 6.10, I have tested each release. In general, each was a step forward in user-friendliness. Ubuntu 8.10 is no exception… and in my opinion it is the best Ubuntu release ever. Ubuntu Intrepid Ibex is what I was expecting from 8.04.
The new artwork makes it feel different from previous releases. While exploring the menus and configuration, you will notice that it is not just a feeling. In the past, to configure a wireless network, you had to go to System / Administration / Networking, then set your WEP key (if using WEP) and activate the adapter. Often I had to deactivate and reactivate the adapter a couple of times to finally get a working connection.
Now the network manager can be found under System / Preferences / Network configuration, but I didn’t have to go there, since a tooltip appeared at first logon stating “wireless networks available”. I just had to select the right SSID and configure the WEP key from there. On top of that, the connection was established within seconds on the first try…
People’s definition of “home computer” has changed quite a bit. Previously, “home computer” was synonymous with “desktop PC”; now, if notebook sales have not already exceeded desktop sales, I guess it is just a matter of time… I think, with this release, Ubuntu followed this wave of mobility.
When you have a home server, you may not have the luxury of being connected to the Internet with a static IP address; instead, your IP address is leased from your Internet Service Provider and may change every day, depending on the setup.
So when you are out in the world, it is not possible to guess the IP address of your computer; how could you connect to it if you wanted to? The solution is to use a dynamic DNS forwarder like DynDNS.org. Speaking of DynDNS, many standard firmwares (Netgear, Linksys, etc.) support updating your IP when it changes. Since I use the Tomato firmware, I will explain how to configure it.
- First, you must register on DynDNS.org by creating an account.
- Then go into the “My Services / Host services” section of the site and add a new host name.
- The free service lets you choose any sub-domain name from their available list.
(ex: myhostname.getmyip.com, myhostname.kick-ass.net)
- So you have to enter:
- the name you want (myhostname)
- the domain name (kick-ass.net)
- if you want to enable wildcards
(if enabled, anything.myhostname.kick-ass.net will be forwarded as well)
- the service type: Host with IP address
- leave the IP address, Tomato will update it automatically
- leave the mail router checkbox unchecked
- create that host…
- Now you have to configure Tomato
- Navigate to the router’s web administration
- Open up the page under Basic / DDNS.
- Tomato offers to configure up to 2 host names, let’s fill the first one:
- IP Address: Use WAN IP xxx.xxx.xxx.xxx (recommended)
- Service: DynDNS – Dynamic
- Username: your account name
- Password: your account password
- host name: myhostname.kick-ass.net
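Under the hood, Tomato (like the other firmwares mentioned above) simply calls the DynDNS v2 update API: an HTTP GET to members.dyndns.org/nic/update, authenticated with your account name and password. As a minimal Python sketch (the host name and IP below are placeholders, not real values):

```python
# Sketch of the update request a router firmware builds for DynDNS.
# Host name and IP are placeholders; credentials go in HTTP Basic auth.
from urllib.parse import urlencode

def build_update_url(hostname, myip):
    params = urlencode({"hostname": hostname, "myip": myip})
    return "https://members.dyndns.org/nic/update?" + params

url = build_update_url("myhostname.kick-ass.net", "203.0.113.10")
print(url)
# The firmware sends this with Basic auth and gets back a short status
# line such as "good <ip>" on success or "nochg" if nothing changed.
```

This is why you can leave the IP address field empty when creating the host: the router reports its current WAN IP on every change, and DynDNS updates the record.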
I recently updated my blog categories. If you have (or had) a hard time finding what you were searching for, please submit a comment on this post and I will rectify it.
I also made tag navigation available. The list of tags is displayed just under the category of a post.
If you have been using PHP on Windows with SQL Server 2005, you may have hit some problems, especially if you wanted to exploit new SQL Server 2005 features like the xml data type, NVARCHAR(MAX), etc.
The driver with the best support for these was the PHP ODBC wrapper combined with the SQL Server 2000 ODBC driver. The bad news is that, by default, PHP ODBC uses server-side dynamic cursors, which is exactly what Microsoft says to avoid as much as possible (unless you really need them): they are very slow, server-resource intensive, poor performing, etc.
Some succeeded in changing the way ODBC handles result sets by using a hint at connection time (SQL_CUR_USE_ODBC), but it didn’t help for us. Some perfectly valid parameterized queries were just giving unexpected results.
I even downloaded PHP’s source code to see why it was using dynamic cursors by default. If I could at least change the default cursor, we might have had a little performance increase… It was hardcoded to “dynamic”, with the following comment on top of it:
Try to set CURSOR_TYPE to dynamic. Driver will replace this with other type if not possible.
So the next step would have been to change it and recompile… forget about it!
Then, sometime at the end of 2007, I discovered an alpha community preview release of a new driver made by Microsoft. Woohoo! That version was so unstable with xml data types that it made my Apache server crash…
Fortunately, the official release finally got out and we have been testing it for some time now! So far, there are no blocking bugs. The quality is good enough that I took the time to create a Creole wrapper for it (our web application uses Creole as its database wrapper API) and started using it full-time in our development environment.
Here are some observations:
- On my laptop, based on a non-official, non-extensive performance test, I had a 400% to 500% performance boost for fetching 200 records of a large (numerous fields) table.
- UTF-8 support exists, but conversion must be done manually, field by field (better to have a database wrapper API…), and at a huge performance cost.
- UTF-8 support works only for query parameters and result-set values. If you hardcoded a query filter (I know it is not a best practice, but we all support legacy applications…), you will have to rewrite it with parameters or drop UTF-8 support.
- If you are using PHP from a Linux server, you are still left alone because the Microsoft driver relies on the ODBC SQL Server Native Client driver that works only on Windows.
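To illustrate the manual-conversion point above, here is the idea in a language-neutral Python sketch (the row layout and the source encoding, Windows-1252, are my assumptions): a wrapper has to walk every string field of every fetched row and re-decode it, which is where the performance cost comes from.

```python
# Hypothetical sketch of per-field charset conversion on a fetched row,
# mimicking what a database wrapper must do when the driver returns
# non-UTF-8 strings (Windows-1252 assumed here).
def row_to_utf8(row, src_encoding="cp1252"):
    converted = {}
    for field, value in row.items():
        if isinstance(value, bytes):
            converted[field] = value.decode(src_encoding)  # text fields re-decoded
        else:
            converted[field] = value  # numbers, dates, etc. pass through
    return converted

raw_row = {"name": b"Caf\xe9", "qty": 3}
print(row_to_utf8(raw_row))  # {'name': 'Café', 'qty': 3}
```

Doing this once per field, per row, for every query is exactly the kind of overhead a centralized wrapper API hides, but cannot make free.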
*** UPDATE : 2009-02-04 ***
Thanks to Piotr’s initiative, a new Creole driver is now available (under the LGPL license) here: