CaptainN2003 at yahoo.com
Tue Sep 18 16:01:16 GMT 2007
I have been thinking about how to optimize the js files from WordPress
for a while as well (especially the ginormous Prototype.js). There are
two problems that I can see with using packer.
The first is what you already mentioned: there can be problems if the js
has not been prepared with packer in mind (for example, not using a
semicolon after a function assignment).
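A minimal illustration of that semicolon hazard (the file names and code here are made up for demonstration):

```javascript
// The semicolon hazard in miniature: a function assignment with no
// trailing semicolon, followed by a "file" that starts with "(" -- as
// happens when scripts are concatenated or minified onto one line.
const concatenated =
  'var f = function () { return 1; }\n' +  // no semicolon here
  '(function () { return 2; })();';        // next "file" starts with "("

// ASI does not insert a semicolon before "(", so the second group is
// parsed as an argument list: the first function is called with the
// second as its argument, returns 1, and then 1 is called -- TypeError.
let error = null;
try {
  eval(concatenated);
} catch (e) {
  error = e;
}
console.log(error && error.name);  // TypeError
```

With the semicolon in place, the two pieces stay independent statements and both run fine.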
The second is that some of the included js libraries are already
compressed using either packer or something else (jQuery).
There is one approach that I have been thinking could help quite a lot,
though, especially for people like myself who are on Windows hosts that
don't gzip by default.
Since WordPress has a queue-script function, WordPress knows about all
the scripts that will be needed before they are output to the header
(an assumption; I haven't looked into the code yet). So why not glue all
the queued scripts together into a single file and then gzip them in one shot?
This would reduce the number of connections to the server, as well as
reduce the file size of the gzipped scripts.
And since it's going through php, we could add other performance-enhancing
features, such as far-future Expires headers (we could construct a url
that contains information about the included files and their versions,
so that clients could pick up updates when things change).
Once that's in place, we could even go ahead and look into adding some
experimental packer or jsmin or something else.
TinyMCE already has a similar feature (it packs its many separate js
files into one, then gzips it using php).
Alex Günsche wrote:
> Hi all,
> server-based compression, it's always a big bunch of bytes to download.
> Ok, most of it is in the admin backend, but still, I think there's a
> possibility for optimization -- especially if you think of WPMU
> Some of you might know the nice JS packer by Dean Edwards, e.g. used
> by MooTools by default. It is a very clever tool to "compress" JS.
> I have made some tests on my WordPress installs, one is WP 2.0.11, the
> other is 2.3-beta3.
> With a little shell foo, you'll find out the amount of JS code in the WP core:
> sum=0; for j in $(for i in $(find ./wp-admin/ ./wp-includes/ -iname "*\.js"); do du -b $i | cut -f 1; done); do sum=$[ $sum+$j ]; done; echo $sum
> The outcome is: 357642 bytes for WP 2.0.11 and 1020175 bytes (!) for WP 2.3-beta3.
> Now I've taken the php5 version of the packer to test it on the JS
> files in the WP core. I extracted the package to WP_ROOT/jspacker/ and
> uncommented the argv section in example-file.php. Then I ran the
> following:
> for i in $(find ./wp-admin/ ./wp-includes/ -iname "*\.js" | sed 's|\.js$||'); do cp $i.js $i.src.js; php ./jspacker/example-file.php $i.src.js $i.js; done
> Running the first command line again (the sum stuff) and subtracting the
> results from the respective previous results (because it also counts the
> new/copied src files) shows that the compressed JS files now make a
> total of 195703 bytes (2.0.11) and 562329 bytes (2.3-beta3). This is
> about 50% of the total size saved.
> Now the big question is: does it break the JS? It must be said that in
> rare cases the packer can break JS code. And indeed, some short tests
> on both versions showed the following:
> - Most JS, including the remote stuff, works as expected.
> - On 2.0.11, TinyMCE seemed not to be able to initialize, so WP came
> up with the fallback editor, though that one worked.
> - On 2.3-beta3, TinyMCE works fine, though switching between Visual and
> Code doesn't work.
> In Firefox, the web developer toolbar shows some errors with the syntax
> in the packed files, which explain the above problems.
> Bottom line so far: there is a possibility for much optimization in the
> size of JS downloads. There are some minor problems, which can be solved
> though (e.g.: fix files manually, not compress files that seem to break,
> fix packer).
> Now one can even imagine going one step further: I also find it annoying
> that one has to download so many files. In theory, it would be better to
> pack all JS into *one* big file and let the UA download that one. This
> would significantly reduce the number of HTTP requests.
> Of course, I wouldn't want to merge TinyMCE, Prototype and all the other
> JS in one file in the source distribution, but one could imagine having
> a backend JS handler that gathers all JS files to be inserted and
> delivers them as one. We do have API hooks for JS (at least in the
> admin), so it wouldn't be too hard to gather the files.
> It's even possible to imagine a global on-the-fly packer combined with a
> JS cache, which could again bring additional performance.
> Just some ideas ... what do you guys think?
> Kind regards,
>  http://dean.edwards.name/packer/
>  http://mootools.net/download