#foswiki 2017-10-02,Mon


***ChanServ sets mode: +o MichaelDaum [06:28]
..................... (idle for 1h41mn)
zak256: MichaelDaum: Good morning. I heard you can perhaps help me with DBCachePlugin/DBCacheContrib? I installed both on our new 2.1.4 install and have the problem that pages of some webs are not loaded. After some time the webserver runs into a timeout. I just set TRACE=1 in WebDB.pm and now see *a*lot* of messages saying "reloading ..." in my apache error log. It seems like every single page of that web is read. Does that tell you something?
I now see those messages are still being written even after I got the timeout and "Internal Server Error" message in the browser. It just keeps on doing something.
[08:09]
MichaelDaum: Hi zak256 [08:12]
zak256: Hi Michael. [08:12]
MichaelDaum: I've seen your reports in the logs
something about using an undefined $text in substitutions etc.
[08:12]
zak256: Yes, I got that once, but I don't know if it's related. [08:13]
MichaelDaum: this is odd, as $text is set by an ordinary readTopic() operation
which returns ($meta, $text) ... both of which should be defined ... but unfortunately aren't in your case
[08:14]
zak256: Right now I don't see any of these messages in the log. [08:14]
MichaelDaum: which store impl do you use: rcs or plain file? [08:15]
zak256: rcs [08:15]
FoswikiBot: rcs is a well proven, very reliable store. Been in Foswiki and TWiki for a decade or more [08:15]
zak256: For now I assume these substitution errors have a different cause. [08:16]
MichaelDaum: FoswikiBot, spot on [08:16]
FoswikiBot: MichaelDaum: Search me, bub. [08:16]
MichaelDaum: what are your settings for $Foswiki::cfg{DBCachePlugin}{MemoryCache}, $Foswiki::cfg{DBCacheContrib}{AlwaysUpdateCache}, $Foswiki::cfg{DBCacheContrib}{LoadFileLimit} and $Foswiki::cfg{DBCacheContrib}{Archivist}?
zak256: $Foswiki::cfg{DBCacheContrib}{AlwaysUpdateCache} = 0;
$Foswiki::cfg{DBCacheContrib}{Archivist} = 'Foswiki::Contrib::DBCacheContrib::Archivist::Sereal';
$Foswiki::cfg{DBCacheContrib}{LoadFileLimit} = '0';
$Foswiki::cfg{DBCachePlugin}{MemoryCache} = 1;
$Foswiki::cfg{DBCachePlugin}{UseUploadHandler} = 0;
[08:17]
FoswikiBot: https://trunk.foswiki.org/System/PerlDoc?module=Foswiki::Contrib::DBCacheContrib::Archivist::Sereal [08:17]
MichaelDaum: okay, very good.
now, before any further testing, please shut down your web server & foswiki
we'll do some testing on the command line without the web interface interfering
[08:17]
zak256: I just did that. Took some time, but apache is now stopped. [08:18]
MichaelDaum: no fcgi processes hanging around?
please double check ;)
kill them on sight
[08:18]
zak256: no results for: ps -ef | egrep -i "fcgi|http" [08:19]
MichaelDaum: any cronjobs that could be called in the meantime?
or an iwatch daemon ... you name it
[08:19]
zak256: no, none. it's just a test server exclusively for the wiki [08:20]
MichaelDaum: then you are all set to rebuild the dbcache for all webs on the command line... [08:20]
zak256: okay... [08:21]
MichaelDaum: first delete the old one by removing the working/work_areas/DBCacheContrib directory completely [08:21]
zak256: done [08:21]
MichaelDaum: now go to <foswiki-dir>/bin [08:21]
zak256: okay [08:22]
MichaelDaum: ./rest /DBCachePlugin/updateCache
this may take a while ...
any errors on the console?
[08:22]
zak256: Okay... now I get the same messages I got previously in my apache error log (TRACE=1 is still set)... [08:23]
MichaelDaum: ah, then cancel the process, disable TRACE, delete the dir again and restart [08:23]
zak256: Nothing that looks like an error for now [08:23]
MichaelDaum: just to make sure only error msgs are on the console
btw how many webs and topics have you got?
and one important thing I've forgotten: execute the rest call as the www-data / httpd / www / apache user (depending on your distro)
make sure that each file and every directory is readable and writable by www-data
[08:24]
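Putting the rebuild procedure from this exchange together as one command sequence. This is a sketch, not verbatim from the conversation: the foswiki path, the web user name (www-data here) and the service name vary per install.

```
# stop the web server so no fcgi process keeps serving with a stale cache
systemctl stop apache2                       # or httpd, depending on distro
# double check: no fcgi/http processes hanging around
ps -ef | egrep -i "fcgi|http"
# delete the old cache completely
rm -rf /var/www/foswiki/working/work_areas/DBCacheContrib
# rebuild as the web server user, capturing STDERR for later inspection
cd /var/www/foswiki/bin
sudo -u www-data ./rest /DBCachePlugin/updateCache 2> /tmp/updateCache.log
```

Running the rest call as the web user matters because every cache file it creates must remain readable and writable by the web server afterwards.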
zak256: apart from Main, Sandbox, System and Trash we have 4 more webs.
yes, I executed the command as our www user which runs the foswiki
[08:25]
MichaelDaum: thumbs up [08:25]
zak256: only one of those webs has a lot of pages, the others don't [08:26]
MichaelDaum: now let the thing update its cache and capture STDERR to some file for later inspection [08:26]
zak256: ls -l *.txt | wc -l tells me 25373 pages [08:26]
MichaelDaum: get a fresh coffee in the meantime ... like I am doing now :) [08:26]
zak256: :-) good idea. the process is still running [08:26]
MichaelDaum: while running ./rest /DBCachePlugin/updateCache [08:26]
zak256: yes
Is Archivist::Sereal the best setting for our setup?
[08:27]
MichaelDaum: are these pages all user-generated content, or automatically converted or generated using some external tool?
yes, Sereal is the best archivist
btw did you upgrade to the latest versions of DBCacheContrib & DBCachePlugin?
[08:28]
zak256: I downloaded and installed it last Friday, so it should be the latest version
the pages are both: we have user-created topics which often include other pages with searches
these included pages then provide some kind of info box which displays the contents of form fields and so on
I know this would probably be better implemented as SkinTemplates. It's on my TODO list...
[08:30]
MichaelDaum: what's the avg loading time of a page on your system?
probably best to only test this once updateCache has finished
[08:32]
zak256: it depends... when there are no searches involved it's below 1 second or so
with bigger searches it goes to 30 seconds and above... but that's without _any_ optimizations
[08:32]
MichaelDaum: like System.WebHome as a baseline ... (after removing the tips contrib from it) [08:33]
zak256: WebHome doesn't have a search and is at about 1 second
Ahh... the process finished.
No output was given
[08:33]
MichaelDaum: System.WebHome is at 1 second? [08:34]
zak256: Yes, last time I checked. [08:34]
MichaelDaum: please remove <div class="tipsOfTheDay">%INCLUDE{ "TipsOfTheDayInclude" warn="off" }%</div> and test again
just to get an idea of the baseline, y'know
30 seconds for a %SEARCH page is probably not well received by your user base, is it?
[08:34]
zak256: It is already a modified version of the default WebHome
it has no INCLUDE macro
[08:35]
MichaelDaum: k [08:36]
zak256: and no search [08:36]
MichaelDaum: well
so that's as fast as it could ever get then
[08:36]
zak256: and yes, 30 seconds is unacceptable, but this is the new wiki. In our old one we have caches which were even modified by someone...
yes, I guess that's about the optimum
[08:36]
MichaelDaum: a well-tuned foswiki (with tips of the day removed) should return System.WebHome in about 400ms
with the filesystem on an SSD
alright. now that updateCache has computed all caches for all webs, they will be updated incrementally from now on.
[08:37]
zak256: ok, I have installed a cronjob which curls two pages and it says 0.24-0.31 seconds for WebHome. [08:39]
MichaelDaum: yay [08:39]
zak256: it is installed on the same machine though
(right now)
[08:39]
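A timing probe like the cronjob described here can use curl's built-in timer. The URL below is a placeholder for your own site:

```
# fetch the page, discard the body, print the total request time in seconds
curl -s -o /dev/null -w '%{time_total}\n' https://wiki.example.com/bin/view/System/WebHome
```

Note that running the probe on the wiki host itself, as done here, excludes network latency from the measurement.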
MichaelDaum: now time to restart the engines
the first time an fcgi process sees the light it will read the dbcache into memory. that's why the first click might be slower than any further one.
[08:40]
zak256: ok... just cleared the logs and started httpd [08:41]
MichaelDaum: you will notice a slight delay depending on the lifetime your fcgi processes are configured for in the apache configs
such as "kill an fcgi after it has served 100 pages" ...
a new foswiki.fcgi will be started to fill that slot and will have to re-read the dbcache of a web as well
this is when it creates a certain slowdown: the price you pay for faster %DBQUERYs afterwards
[08:41]
zak256: I set FcgidIOTimeout 300 and FcgidMaxRequestLen 4194304 in fcgid.conf
Other values should be at their defaults
Any recommendations for further tweaking there?
[08:43]
MichaelDaum: basically a foswiki.fcgi should be fine serving as many requests as it can. I never limit the number of requests a backend process can serve. Better to kill it off when it has grown to a certain memory footprint ... depending on the amount of content you have. [08:44]
zak256: My topic with a lot of searches still takes some time to load... [08:44]
MichaelDaum: of course this also depends on your server's memory resources and the number of concurrent foswiki.fcgi processes loaded at the same time
note that %SEARCH is NOT optimized by DBCacheContrib/Plugin
you will have to replace it with a %DBQUERY
[08:45]
zak256: Ah, okay. So I have to replace ... yes
But I guess I cannot replace every SEARCH just with DBQUERY, can I?
[08:45]
MichaelDaum: there are differences in the parameters and query syntax. [08:46]
zak256: Ok, I have to read about that in the docs then... and change a lot of searches...
Can/should I activate the default foswiki cache as well, in addition to DBCache?
[08:46]
MichaelDaum: these are two different caches [08:47]
zak256: I know [08:47]
MichaelDaum: one is caching html pages
the oth... okay
[08:47]
zak256: Just thinking if it makes sense [08:47]
MichaelDaum: sure does [08:47]
zak256: And the manual cache update I did earlier on the CLI... maybe I can/should set up a cronjob to clean the cache, i.e. rm -f working/DBCache... and call it again every night or so? [08:49]
MichaelDaum: I recommend enabling page caching in LocalSite.cfg, disabling it again in SitePreferences ... and ONLY enabling it on a per-topic basis, such as for those long-running search pages and the like ... the frontpage everybody hits in the morning, WebRss, etc. [08:49]
zak256: Okay... yes, that sounds reasonable. [08:49]
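The setup recommended here (cache on globally, off by default, on per topic) would look roughly like this, assuming Foswiki's standard page-caching switch and the CACHEABLE preference; verify against the PageCaching documentation of your release:

```
# LocalSite.cfg: enable the HTML page cache globally
$Foswiki::cfg{Cache}{Enabled} = 1;

# Main/SitePreferences.txt: disable caching again by default
#    * Set CACHEABLE = off

# then on each long-running search topic, re-enable it individually:
#    * Set CACHEABLE = on
```

This way only the expensive pages pay the cache-maintenance overhead, while everything else is rendered fresh as before.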
MichaelDaum: the rest call to updateCache is ONLY required the first time dbcache is installed, and when your content is changed underneath it by an external script or the like
there is no need to rm -f the cache dir anymore
[08:50]
zak256: Ah... yes! Good to know.... We have several scripts that update topics externally... :-/ [08:50]
MichaelDaum: I guess you are all set to start experimenting with %DBQUERY [08:51]
zak256: What happens with these pages then? The content is not shown, I suppose? [08:51]
MichaelDaum: which pages? [08:51]
zak256: So we have a page with some content that is cached fine, for example... [08:52]
MichaelDaum: y [08:52]
zak256: ...now some script runs and inserts or changes some text in that page.
outside of foswiki!
if someone then reloads that page, what happens?
I guess the added/changed content is not shown?
[08:52]
MichaelDaum: you will need this in your cronjob: (1) run your script (2) ./rest /DBCachePlugin/updateCache (3) ./view refresh=all
refresh=all will empty the page cache
[08:53]
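The three-step cronjob described here could be sketched as a small wrapper script. Paths and the generator script name are hypothetical; step (3) only matters for topics that have page caching enabled:

```
#!/bin/sh
# (1) run the external script that modifies topics outside of foswiki
/usr/local/bin/my-topic-generator        # hypothetical content generator

cd /var/www/foswiki/bin
# (2) let DBCacheContrib detect and reload the changed topics
sudo -u www-data ./rest /DBCachePlugin/updateCache
# (3) empty the HTML page cache (skip if the touched topics are not page-cached)
sudo -u www-data ./view refresh=all
```

Since step (2) detects changes by itself, it is cheap to run after every batch of external edits rather than on a fixed schedule.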
zak256: but that doesn't reload *all* pages then?
I mean, does it have some kind of notable impact?
[08:54]
MichaelDaum: (2) will find out about the changes by itself [08:54]
zak256: Ah... (3) is just for the html cache? [08:54]
MichaelDaum: exactly, it tosses the page cache [08:55]
zak256: so if a page which gets updated is not included in page caching I can skip this step [08:55]
MichaelDaum: yes [08:55]
zak256: Great!
I guess for now all my questions are answered!
Thank you very much!
[08:55]
MichaelDaum: y'welcome. [08:56]
zak256: I will now do some reading and testing with DBCACHE [08:56]
MichaelDaum: have fun [08:56]
zak256: s/DBCACHE/DBQUERY [08:56]
.... (idle for 15mn)
zak256: Can I run the ./rest command with the full path, without changing into the bin directory?
And I am just wondering whether it makes sense to execute ./rest /DBCachePlugin/updateCache automatically every hour or so instead of updating all the scripts. Or wouldn't you recommend that?
Or maybe once every morning.
[09:11]
........ (idle for 37mn)
MichaelDaum: you do need to run ./rest in this directory, as it loads its config from there
running updateCache every hour seems like a waste of resources to me. don't you have access to the content generator scripts?
[09:52]
zak256: I set up bin/LocalLib.cfg to specifically point to our lib dir, so maybe that will suffice?
and yes, I have access to the scripts, but it was just a thought to catch everything with that. I don't know how expensive the updateCache command is.
and of course... if there is a script which, let's say, is called 100 times for 100 topics, I only want to run updateCache once afterwards and not 100 times after each script run
But I still have to look into all those scripts; I don't really know what dark magic I will uncover there.
[10:03]
MichaelDaum: :) [10:14]
zak256: cthulhu ftagn [10:15]
........ (idle for 37mn)
zak256: MichaelDaum: Can't I provide wildcard matches for the DBQUERY topic parameter? [10:52]
MichaelDaum: have the wildcard in the query instead [10:54]
zak256: Ah... it seems I need to use 'include=MyTopics*' [10:54]
MichaelDaum: %DBQUERY{".... AND topic=~'regex'" ...}%
ya right
better to let the query term itself weed out the unwanted results
otherwise you first generate a large hit set and then, while iterating over it, skip results via the include and exclude params
[10:54]
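The two variants under discussion, side by side. The web name, the format string and the "1" match-all term are illustrative assumptions; check DBCachePlugin's documentation for the exact %DBQUERY syntax:

```
The preferred form: the query term itself weeds out unwanted topics:
%DBQUERY{"topic=~'Item\d+'" web="Tasks" format="$topic"}%

The slower form: every topic is collected first, then filtered by include:
%DBQUERY{"1" web="Tasks" include="Item*" format="$topic"}%
```

Both iterate over the whole web, but the first form skips non-matching topics inside the query evaluation instead of building a large hit set and discarding entries afterwards.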
zak256: Hmm... I would have guessed it's processed the other way round. [10:56]
MichaelDaum: that's what the topics param is for: it only queries those topics ... and not all of the web [10:56]
zak256: Yes, but we have topics with numbers appended, like Item1234, Item1235, and so forth. [10:57]
FoswikiBot: https://foswiki.org/Tasks/Item1234 [ Item1234: MathModePlugin's latex2img not executable ] https://foswiki.org/Tasks/Item1235 [ Item1235: ActivePerl on Windows 2003 IIS 6.0 ISAPI ] [10:57]
zak256: So I would just like topic="Item*"
and then process all those topics
[10:57]
MichaelDaum: ... or have this in your query: topic=~'Item\d+' [10:58]
zak256: This will not get all the topics beforehand?
ok... will try
[10:58]
MichaelDaum: any topics="match expression" parameter will still have to iterate over all of the web in order to collect the topics that match the expression ... which basically defeats its actual purpose of _not_ iterating over all of the web.
so the best you can do is iterate once over all topics and check the query term
[11:00]
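The point made here, that the engine iterates over every topic name and keeps only those matching the expression, can be mimicked on the shell. The topic names below are made up for illustration:

```shell
# iterate over a candidate list and keep only names matching Item\d+,
# the same filtering the query term topic=~'Item\d+' performs;
# ItemTemplate is dropped because no digit follows "Item"
printf '%s\n' Item1234 Item1235 WebHome ItemTemplate | grep -E '^Item[0-9]+$'
# prints Item1234 and Item1235
```

The cost is one pass over all names either way; the win is doing the filtering in the query term instead of materializing a full hit set first.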
zak256: Ah, of course... it's a database now. I still think in filesystem terms. [11:01]
MichaelDaum: only a strict list such as topics="Item1, Item2, Item3, ..." would save you from iterating over all of the web [11:02]
FoswikiBot: https://foswiki.org/Tasks/Item1 [ Item1: set up mailercontrib crons ] https://foswiki.org/Tasks/Item2 [ Item2: set up nextwiki-svn mailing list ] [11:02]
zak256: ok, I understand [11:03]
........ (idle for 37mn)
zak256: Can I specify the order of the values for a sort formfield parameter somehow? [11:40]
MichaelDaum: you mean sort the values of a multi-value select formfield? [11:41]
zak256: For example a formfield "Status" which can have the values "active, archived, out-of-use".
And I want to have all active and archived topics shown first, then out-of-use last.
[11:42]
MichaelDaum: just sort by Status [11:43]
zak256: ok [11:43]
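It may help to notice that the desired order (active, archived, out-of-use) happens to be alphabetical, which is presumably why a plain sort on the field suffices here. A sketch, with a hypothetical web and format string, assuming %DBQUERY's sort parameter accepts a formfield name as the exchange suggests:

```
%DBQUERY{
  "Status=~'.'"
  web="Projects"
  sort="Status"
  format="$topic: $formfield(Status)"
}%
```

If the wanted order were not alphabetical, the values would need to be renamed (or prefixed) so that lexical sorting produces it.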
...... (idle for 25mn)
***ChanServ sets mode: +o gac410 [12:08]
...... (idle for 28mn)
zak256: MichaelDaum: Is it on purpose and supported that the syntax formfield~'*foobar*' works as well in a search clause, or is that just by accident? I cannot see any documentation for this here.
This works for topic~'Item*' as well.
[12:36]
gac410: Hi all ... reminder: the Release Meeting starts in 15 minutes - channel #foswiki-release. Everyone welcome [12:47]
MichaelDaum: I see work on the translation infrastructure: is http://translate.foswiki.org/ hosted by us now? [12:49]
gac410: Hi MichaelDaum. Not yet [12:55]
MichaelDaum: ah ok [12:56]
gac410: we need to get a backup of the database and restore it on f.o ... unless uebera|| made some progress this weekend [12:56]
MichaelDaum: awesome [12:56]
gac410: Release meeting starting now in #foswiki-release [13:02]
........................................ (idle for 3h19mn)
***ChanServ sets mode: +o Lynnwood [16:21]
............ (idle for 55mn)
foswiki_irc9: hello
i have a quick question: i migrated my foswiki installation and I can access the main site but nothing else. in the apache error log i always get AH02811: script not found or unable to stat: /var/www/foswiki/bin/viewview
has anyone seen that before and has some tips, please? would be great :)!
[17:16]
...... (idle for 29mn)
foswiki_irc9: nevermind, fixed it myself, thanks anyway. have a great evening [17:46]
....................................... (idle for 3h12mn)
***ChanServ sets mode: +o Lynnwood__ [20:58]
