#foswiki 2016-04-21,Thu


GithubBot: [distro] gac410 pushed 1 new commit to master: https://git.io/vwcjy
distro/master 9f624de George Clark: Item13831: Language selector broken...
[00:19]
***GithubBot has left [00:19]
FoswikiBot: https://foswiki.org/Tasks/Item13831 [ Item13831: JS error in !System.LanguageSelector ] [00:19]
......... (idle for 44mn)
GithubBot: [distro] gac410 pushed 1 new commit to master: https://git.io/vwCJa
distro/master 4aac8f3 George Clark: Merge branch 'Release02x01'
[01:03]
***GithubBot has left [01:03]
....... (idle for 30mn)
***ChanServ sets mode: +o Lynnwood [01:33]
............................................................ (idle for 4h59mn)
***ChanServ sets mode: +o CDot [06:34]
........... (idle for 52mn)
***ChanServ sets mode: +o MichaelDaum [07:26]
......................... (idle for 2h2mn)
LorandG: Hello, I have a question: is there a way to restrict access and editing of the group topics in Main? We are using SSO with Kerberos, and right now everyone can change groups as they please, which is not very nice. Thank you [09:28]
....................... (idle for 1h53mn)
No answer or help? [11:21]
VilleS: I'm trying to copy files with bulk_copy.pl, but it is very slow with some topics / webs. Is there anything I could do to make it work faster? I've already raised the process priority.
It will take me weeks to get through the copy at this pace.
[11:25]
JulianLevens: LorandG: Do your group topics have their own ALLOWTOPICCHANGE preference?
VilleS: How many webs/topics do you need to convert?
VilleS: Do you have topics with a lot of history?
[11:27]
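(Regarding the group-topic question above: a minimal sketch of what such a setting looks like inside a group topic in the Main web; the group and member names are placeholders, not taken from this discussion:
   * Set GROUP = Main.UserOne, Main.UserTwo
   * Set ALLOWTOPICCHANGE = Main.ExampleGroup, Main.AdminGroup
With ALLOWTOPICCHANGE set like this on Main.ExampleGroup, only members of the group itself and of AdminGroup can edit the membership list, which is the usual way to stop arbitrary users from changing groups.)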
VilleS: Maybe a hundred webs with thousands of topics. Copying one topic can take something like five minutes, even though they are short ones. Lots of history, yes.
I mean a single version of a topic can take five minutes.
Some go fast.
There must be something I can do...
[11:29]
JulianLevens: History is probably the issue: what are the source and destination stores, i.e. RcsLite, RcsWrap, or PlainFile? [11:31]
VilleS: RcsWrap to PlainFile [11:32]
JulianLevens: OK, you could change RcsWrap to RcsLite, but I'm pretty sure RcsWrap is the faster of the two, so that won't help.
You could test that theory of course; I'm not certain.
As history is likely to be the problem, you could use the command line option that lists topics for which only the latest revision is copied. That's typically done for the WebStatistics topic.
[11:37]
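(For illustration only, a hedged sketch of how that option might be used; the option name, its argument form, and the trailing paths are assumptions here, so check the script's --help output for the real syntax:
perl tools/bulk_copy.pl --latest '*.WebStatistics' <source-bin-dir> <destination-bin-dir>
The intent is that topics matching the wildcard, typically the auto-generated WebStatistics topics, are copied without their full history, skipping the expensive revision reconstruction for them.)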
VilleS: Does bulk_copy.pl copy the topics by diffing from the latest version back to version 1? And if a topic has many versions, does it take a while to diff from version 60 down to version 1? Or how does rcs work? [11:39]
JulianLevens: Of course, that's only OK if you can sacrifice some history on some topics.
rcs stores differences between versions, and reconstructing a revision from those differences takes quite a bit of processing.
One of the key benefits of PlainFile is that each version is kept as a separate file, so there is no need to 'calculate' what a revision looks like.
Not that that helps you right now.
[11:40]
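(To put rough numbers on that: rcs keeps the latest revision in full and older trunk revisions as reverse deltas, so reading revision 1 of a 60-revision topic means applying 59 deltas. If every revision is reconstructed independently from the head, that is about 59 + 58 + ... + 1 = 1770 delta applications for one topic, versus 59 if each intermediate result is reused, which is why topics with long histories dominate the copy time.)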
VilleS: Then bulk_copy.pl could be much faster if it walked from version 60 down to version 1 with some caching, instead of recalculating from the top every time. I guess right now it calculates each revision from scratch: 60 down to 1, 60 down to 2, 60 down to 3, 60 down to 4, and so on.
Is that how it behaves now?
[11:45]
JulianLevens: Probably; I was thinking the same thing. Caching could be added to RcsLite but not RcsWrap, which depends on the Linux rcs command and, judging by the manual, can only extract one version at a time.
As RcsLite is pure Perl that decodes the rcs-format files, something like that might be possible. However, the current Store API doesn't support an 'extract all versions' operation.
Although it might be possible behind the scenes.
It would still be difficult to make sure it only happens in 'bulk' mode; in normal web mode, one version at a time is preferable.
[11:48]
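(A minimal sketch in Perl of the caching idea being discussed, using a toy delta representation rather than the real RcsLite internals or Store API; the point is that each reverse delta is applied exactly once while walking from the newest revision to the oldest:
use strict;
use warnings;

# Toy model: a "reverse delta" here is just a code ref mapping newer text to
# older text. Real rcs deltas are line-based edit scripts, but the cost
# argument is the same: apply each delta once and keep the result, instead of
# re-deriving every revision from the head.
sub reconstruct_all {
    my ($head_text, @reverse_deltas) = @_;  # deltas ordered newest-first
    my @revisions = ($head_text);           # index 0 is the latest revision
    my $current   = $head_text;
    for my $delta (@reverse_deltas) {
        $current = $delta->($current);      # each delta applied exactly once
        push @revisions, $current;          # cache the reconstructed revision
    }
    return @revisions;                      # ordered latest .. oldest
}

# Example: three revisions of a one-line topic.
my @deltas = (
    sub { my $t = shift; $t =~ s/v3/v2/; $t },  # rev 3 -> rev 2
    sub { my $t = shift; $t =~ s/v2/v1/; $t },  # rev 2 -> rev 1
);
my @revs = reconstruct_all("topic text, v3\n", @deltas);
print scalar(@revs), " revisions rebuilt with ", scalar(@deltas), " delta applications\n";

This prints '3 revisions rebuilt with 2 delta applications'; rebuilding each revision from the head separately would apply 0 + 1 + 2 = 3 deltas instead, and that gap grows quadratically with history length.)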
VilleS: I don't see an rcs process running...
perl is at 100%
[11:56]
JulianLevens: Double-check the LocalSite.cfg of your source installation; maybe it is RcsLite after all [11:58]
VilleS: It says RcsWrap
$Foswiki::cfg{Store}{Implementation} = 'Foswiki::Store::RcsWrap';
[11:59]
FoswikiBot: https://trunk.foswiki.org/System/PerlDoc?module=Foswiki::Store::RcsWrap [12:00]
JulianLevens: Then you should see rcs appear as a process from time to time.
If perl is at 100%, it's possible that rcs is not the culprit but something else.
Which perl version are you using?
[12:02]
VilleS: perl -v: This is perl 5, version 16, subversion 3 (v5.16.3) built for x86_64-linux-thread-multi (with 29 registered patches, see perl -V for more detail) [12:03]
JulianLevens: perl 5.16 may have performance issues, especially with Unicode and regexes; 5.20 and later fix those, though I'm not sure that applies here [12:04]
VilleS: It's CentOS 7 [12:05]
JulianLevens: You can use things like plenv to have a local and up-to-date perl. Some people recommend that set-up.
I plan to get that working on some of my own stuff, but I haven't tried it yet.
Sorry, but I don't think a trivial solution is available to speed things up.
[12:06]
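(A hedged example of the plenv route, assuming plenv and its perl-build plugin are already installed; the version number is only an example of a then-current release:
plenv install 5.22.1
plenv global 5.22.1
plenv install-cpanm
plenv rehash
The install-cpanm step gives the new perl its own cpanm, so modules such as JSON::XS can be installed into it rather than into the system perl.)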
VilleS: Is it normal for bulk_copy.pl to split into two processes? [12:19]
In the debugger I see it is spending time in /usr/share/perl5/vendor_perl/JSON/PP.pm:745:
if($ch eq $boundChar){
[12:25]
JulianLevens: Yes, it splits into two processes: one for the source Foswiki and one for the destination [12:26]
VilleS: Should I have a package called perl-JSON-PP? JSON::PP is a pure-Perl module and is compatible with JSON::XS [12:27]
JulianLevens: Ahh, can you install JSON::XS?
You need that for performance.
JSON::PP works but is slower.
[12:27]
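(A hedged way to check which backend the JSON wrapper module actually uses, and to install the XS one; the CentOS package name is an assumption and may require EPEL, with cpan as a fallback:
perl -MJSON -e 'print JSON->backend, "\n"'
yum install perl-JSON-XS    # or: cpan JSON::XS
Once JSON::XS is installed and found ahead of JSON::PP, the first command should report JSON::XS.)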
VilleS: Hahaa, now we are talking :-) It works.
Thanks.
[12:29]
JulianLevens: yw [12:31]
...... (idle for 29mn)
***ChanServ sets mode: +o Lynnwood [13:00]
..... (idle for 23mn)
***ChanServ sets mode: +o Lynnwood__ [13:23]
........ (idle for 36mn)
***ChanServ sets mode: +o Lynnwood [13:59]
..................................... (idle for 3h1mn)
***ChanServ sets mode: +o Lynnwood__ [17:00]
