Sender | Message | Time |
---|---|---|
8 Nov 2024 | ||
(attachment: Community.jpeg) | 13:35:50 |
11 Nov 2024 | ||
Hey all. I'm back to working on my MW 1.39.7 to 1.39.10 upgrade, which includes upgrading SMW from 4.1.3 to 4.2; 4.2 adds the FacetedSearch feature, which requires update.php to be run. The first time I tried, I didn't realize the script had to be run, and changes to user preferences couldn't be saved. I'm not clear on whether deploying the upgrade and then running update.php on each of the wikis would necessitate downtime for the wikis, but it's probably safest to schedule some just in case. Can anyone here elaborate on the specific changes update.php makes for this upgrade? I can't seem to figure that out from digging through the SMW code. | 18:01:29 |
IIRC, when I last brought this up, someone suggested that this new feature might have been better released in 5.x, since a minor release like this normally shouldn't require update.php. | 18:02:23 |
Well, update.php doesn't appear to change the .smw.json upgrade_key value (it only changed previous_version, latest_version, and last_optimization_run), so the dev wiki I tested on didn't seem to break after the update.php run even though I haven't yet deployed the updated .smw.json file. That's a good sign, but I'm still leaning towards a downtime anyway, just to be safe. | 19:27:13 |
Here's another annoyance: the .smw.json files are changed but their sizes are the same, so my deploy script on my wiki manager server, which uses rsync, doesn't see a file size change and so doesn't push the updated versions down to the web servers. However, using rsync's --checksum argument would make it checksum all ~200,000 files, which would probably take an hour. | 20:30:19 |
To be fair, this is really a concern for ANY file in MW, extensions, etc. whose contents change but whose size doesn't. It's one reason I want to move from an rsync-based deploy system to making the web servers' target locations plain git clones, so instead of rsync I can just orchestrate git pull on them. | 20:41:41 |
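Editor's note: one middle ground between a full `--checksum` run and missing the size-neutral changes is a two-pass deploy, where the checksum comparison is restricted to the `.smw.json` files via rsync's include/exclude filters. This is only a sketch; the function name and example paths are hypothetical, not from the discussion above.

```shell
#!/bin/sh
# Sketch of a two-pass deploy. Pass 1 is the normal fast rsync
# (size + mtime comparison) over the whole tree. Pass 2 applies
# --checksum only to the .smw.json files, so rsync never has to
# hash the other ~200,000 files.
deploy_wikis() {
    src="$1" dest="$2"

    # Pass 1: fast sync of everything.
    rsync -a "$src" "$dest"

    # Pass 2: checksum comparison for .smw.json files only.
    # --include='*/' lets rsync descend into directories;
    # --exclude='*' drops every other file from this pass.
    rsync -a --checksum \
          --include='*/' --include='.smw.json' --exclude='*' \
          --prune-empty-dirs \
          "$src" "$dest"
}

# Hypothetical usage:
#   deploy_wikis /srv/wiki-build/ web1:/var/www/wikis/
```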
Sorry for all the spam, but I just noticed the SMW 4.2 upgrade docs also say to add --optimize-autoloader to the composer update command. I've never used that option so I'm wary of using it now. Am I safe to leave it out and perhaps start using it with my MW 1.43 upgrade next year? | 21:33:52 | |
12 Nov 2024 | ||
FWIW, the new Special:FacetedSearch page has a link, https://www.semantic-mediawiki.org/wiki/Faceted_search, which goes to a "blank page". | 17:50:54 | |
(attachment: image.png) | 17:50:58 |
Why would it take an hour? My regular deployment system makes a full clone (using rsync, but could also be cp) and that doesn't take many minutes. | 18:21:34 | |
I do not have any stats to provide, but I've always used --checksum when rsyncing MediaWiki codebases and didn't notice any performance problem. IOW, it's fast. | 18:34:29 | |
In reply to @justinl:matrix.org: You don't really need to worry about this... the optimize-autoloader configuration is already present in MediaWiki's composer.json file | 18:35:50 |
If referring to composer commands within the SemanticMediaWiki directory, then it's a good option to use. | 18:37:02 | |
Besides the top-level, I've only ever had to run Composer within the Widgets directory. When would it need to be run from within the SMW directory? | 18:38:12 | |
In the big picture, MediaWiki controls autoloading first (with the Autoload class in core) and generates its own classmap. Then at the end, it loads the composer (optimized) classmap if present. | 18:38:52 | |
In reply to @justinl:matrix.org: I don't know off the top of my head. I'm normally using composer from $IP, and relying on composer.local.json to add extension classes | 18:41:23 |
Same. My build script dynamically generates per-wiki composer.local.json files, creating the extra.merge-plugin.include list of composer.json files it finds in the extensions we install. I just had to add a special run of Composer inside Widgets because of the smarty library requirement. | 18:44:55 | |
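Editor's note: a minimal sketch of what such a generated composer.local.json can look like. The extension names and the `$IP` default are hypothetical (the real script discovers the composer.json files dynamically); the shell wrapper is only here to make the sketch self-contained.

```shell
#!/bin/sh
# Sketch: write a composer.local.json whose extra.merge-plugin.include
# list points at each installed extension's composer.json.
# IP is your MediaWiki root; it defaults to the current directory here
# purely so the sketch can run anywhere.
IP="${IP:-$(pwd)}"

cat > "$IP/composer.local.json" <<'EOF'
{
    "extra": {
        "merge-plugin": {
            "include": [
                "extensions/SemanticMediaWiki/composer.json",
                "extensions/Widgets/composer.json"
            ]
        }
    }
}
EOF
```

With this in place, running composer from `$IP` pulls in each listed extension's dependencies via the merge plugin, rather than requiring a separate composer run per extension directory.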
To be clear, if you [run composer with --optimize-autoloader from $IP], you will get an optimized autoloader from composer (in ./vendor) that will be loaded by MediaWiki after it loads all extension classmaps. | 18:46:05 |
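Editor's note: for readers who do run composer from $IP, the step described above boils down to the commands below. This is a sketch, guarded so it is a no-op anywhere composer or a composer.json isn't present; the path comment is hypothetical.

```shell
#!/bin/sh
# Run from the MediaWiki install root ($IP), e.g. /var/www/mediawiki.
# Guarded so the sketch does nothing where composer or a MediaWiki
# tree isn't available.
if command -v composer >/dev/null 2>&1 && [ -f composer.json ]; then
    # Update dependencies and generate the optimized classmap:
    composer update --no-dev --optimize-autoloader

    # Or rebuild just the classmap without changing installed packages:
    composer dump-autoload --optimize
fi
# Either way, the optimized classmap is written to
# vendor/composer/autoload_classmap.php, which MediaWiki loads after
# registering its own and the extensions' classes.
```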
I must admit that I don't really know what that means. I've never really had to work with the autoloader or "extension classmaps", so I have no idea if/why I would need to do something like what you wrote. | 18:49:56 |
If you're not doing any development, you don't need to worry about autoloading... it's already "auto"matic 🙂 | 18:51:01 | |
Good to know, thanks. Over the years I've learned things here (and there) that I should have been doing, or doing differently, so I wouldn't be surprised if there were still various build and/or maintenance things I should be doing. | 18:53:12 | |
On the other hand, if you want to test something out (e.g. see if Chameleon skin 5 with Bootstrap 5 works on your dev environment), then you can use composer on the command line to muck with what is installed. In that case, you might need to understand autoloading, the OPcache, etc. to avoid headaches. | 18:53:44 |
I do plan to further investigate containerizing the wikis to some degree. I'm not sure how MW config files and such would work with that, but a couple of colleagues and I are going to go over it and see what we can figure out. | 18:53:56 |
Anyone offering a "MW Developers Course" where I could learn that? :) | 18:54:51 | |
MWStake would love to offer such a course, I think! | 18:55:25 |