#foswiki 2017-06-03,Sat


GithubBot[distro] vrurg pushed 1 new commit to Item14237: https://git.io/vHgQx
distro/Item14237 47dc0ef Vadim Belman: Item14237: Completed documenting Foswiki::Config...
[02:36]
***GithubBot has left [02:36]
FoswikiBothttps://foswiki.org/Tasks/Item14237 [ Item14237: Implement Development.OOConfigSpecsFormat proposal ]
https://trunk.foswiki.org/System/PerlDoc?module=Foswiki::Config
[02:36]
............................. (idle for 2h24mn)
GithubBot[distro] gac410 tagged FoswikiRelease02x01x04 at master: https://git.io/vHRUS [05:00]
***GithubBot has left [05:00]
............................................................................. (idle for 6h23mn)
daemonhmm here is a random one for you guys, I want a wikipage called 'Equipment' but it won't let me do that because well it should be a WEB with only one capital ... the web it is a part of is BSDM so ideally BSDM/Equipment and its a basic run down of what equipment I am using for the show; what WikiWord can you think of to describe the page? MySetup came to mind, not sure though [11:23]
.... (idle for 15mn)
jastdaemon: you can usually create a topic that isn't a wikiword (like 'Equipment') if you check the 'allow non-wikiword' checkbox in the "create new topic" dialog, or just go to .../bin/edit/BSDM/Equipment [11:38]
daemonjast, yeah I did not want to do that, I kinda like everything being what it should be :) [11:38]
jastit's more a matter of preference than anything else... I've set up a lot of wikis with lots of non-wikiword topics
UsedEquipment?
[11:39]
daemonEquipmentSetup
StreamHW
[11:39]
jastwell, I guess "used" could mean two things... :} [11:40]
daemonYeah lol
going to be interesting to see how this turns out
just sent a message to one of the bsd tv guys
asking if using 'BSDM' for the series name is a bit too dodgy a pun
as it looks like a typo of something else and I am 99% sure most people are going to re-read it
[11:40]
***jast whistles innocently [11:41]
daemonlol
oh damn I forgot to contact you vrurg yesterday? was it
totally slipped my mind till just now
[11:41]
jastI don't know anything about that, wasn't here yesterday :) [11:44]
daemonah I purchased a domain a while ago
daedb.me
but a couple of people told me it is very close in sound to a pretty harsh insult in some country I forgot
vrurg promised to explain what to me if I send him a message the next day (which I forgot)
not that I mind that much because .me domains are literally pennies, but I am interested to know what it means or is close to meaning
[11:44]
jastlanguages are hard... [11:45]
daemonnot that I *
another friend of mine said its likely because if you remove the 'db' out of it, it is the name of a group of people that are not very nice
or nearly
[11:45]
jastby far the cheapest domains I can get are .de [11:46]
daemonI think .xyz is the cheapest for me [11:46]
jastEUR 3,48 per year [11:46]
daemonlet's have a look-see
I know the most expensive I have is 'tmp.group'
like 30 euro I think
[11:47]
jastmy most expensive is jk.gs (that first part is my initials)
39 euro, but I'm going to move it to a different registrar next year where they charge 30
[11:48]
daemona 2 character domain is pretty rare
on any tld
[11:49]
jastI get .group for 17.88 euro [11:49]
daemonI think it's because it's a 3 char domain it has some 'premium' stamp on it [11:50]
jast(gross)
ah, yeah that sucks
[11:50]
daemonand from what I figure out that just means, we are going to charge you more because its short [11:50]
jast.ws sells one-letter domains for $50k/year [11:50]
***ChanServ sets mode: +o gac410 [11:50]
jasthi George [11:50]
daemonafternoon gac410
ah maybe not as bad as I thought
Final Cost : $12.48
[11:50]
jastthat's decent [11:51]
daemonpretty sure that was .group one [11:51]
gac410hi jast, daemon [11:52]
daemonthe invoice does not bloody say what domain it was for [11:52]
jastconvenient [11:52]
daemonalso decided on 'HardwareUsed'
makes more sense in the context
because I will reference each video series in it to whatever set of equipment it was
[11:52]
jastoh, yeah, that's good [11:53]
daemongac410, finally came up with the name of my freebsd video series; BSDM :D
you likey?
m is for media
[11:53]
gac410cute That will certainly get some "typo" based traffic [11:54]
daemonyes indeed my thought as well
now let's count how many times I cock it up in the actual videos and say what most people think it will be a typo of
[11:54]
gac410not that the typo-matches will have much interest in your videos. [11:55]
daemonwasn't really for typo matches, it was just a half dirty pun [11:55]
jastmy.network is available for registration, only EUR 322 per year! [11:55]
gac410Although with my experiences on bsd maybe not :D [11:55]
daemonI know a lady who is into all that stuff; even constructs things and she loves FreeBSD
so maybe I can spin a niche market
[11:56]
gac410ya never know
It could give you some incentives for some interesting puns along the way I suppose
But I'm not going there
[11:56]
daemonmy plan is to actually write the full scripts I am thinking about for the first few episodes; because I have never actually done video casting before (I've done some DJing from time to time when younger)
I figured I could pass it out to some of the freebsd guys/gals I know
get some feedback before even attempting it
I also need to make something of an application FosWiki can use; perhaps you can advise; my 'idea' was that the first episode ever would be installing FreeBSD, 3 different installs, zfs root, ufs root (zfs hack/ufs root), maybe even installing FreeBSD onto a USB pen so it's a 'live OS'
but I want to include a link to my foswiki where users can register and then say what they would like a video about
you know so I know what people want
[11:57]
jasthave people submit a short blurb of text? you'll be wanting CommentPlugin or MetaCommentPlugin, probably [12:00]
daemonthat is the one thing I wanted to avoid
in the incredibly unlikely event this is popular
[12:00]
jastwhat aspect of it do you not want? :) [12:00]
daemonI wanted something to kinda 'group' stuff together so it can be a summary
#1 requested: How the hell do you use dtrace properly (302 users)
[12:00]
gac410I do know that if you accept registrations, you will start to get a lot of spam registrations, especially if the registration allows comments. [12:01]
jastso, you'd have people either request a new thing, or +1 an existing request?
(or multiple)
[12:01]
daemonjast, yes, ideally when they started typing a 'new thing' it would do some regex hackery and auto bring up a list of currently proposed ideas [12:01]
jastright, there isn't something ready-made for that, I think [12:02]
gac410anyone else having trouble getting slashdot to render? I now get unstyled pages on firefox [12:02]
jastthe regex hackery and "auto bring up a list" could be done in javascript in the browser [12:02]
daemongac410, https://1drv.ms/i/s!Asj-dFOnBAsJjFXXLZ1znrt9Z-oM seems happy enough here [12:02]
jastslashdot looks normal to me [12:02]
daemonI think I will make that actual 'form' for the requests to require two items, [Software name] and [request], because then I can simply create a 2 stage tree and the 'hints' can just bring up other requests for the software_type
software_name may be stronger
pciconf, dtrace, strace, openoffice
w/e
[12:03]
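A rough sketch of the grouping/hint idea described above, with invented data and field names (in practice this lookup would likely live in browser-side JavaScript, as jast suggests):

```python
# Hypothetical existing requests, keyed by software name as daemon proposes
REQUESTS = {
    "dtrace": ["How do you use dtrace properly"],
    "pciconf": ["What do pciconf output fields mean"],
}

def suggest(software, partial):
    """Return existing requests for this software that contain the typed text."""
    needle = partial.lower()
    return [r for r in REQUESTS.get(software.lower(), []) if needle in r.lower()]

print(suggest("dtrace", "properly"))
```

Matching is a plain case-insensitive substring test here; the "regex hackery" could be swapped in where the `in` check is.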
gac410I use "Privacy Badger" and "Noscript" even with noscript disabled it won't render. Privacy Badger has found potential tracking cookies now. [12:05]
daemongac410, I use ublock origin, 9 domains blocked on page
[12:05]
jast16 things blocked, page still renders fine
(from 10 domains)
[12:06]
daemonnever expected so much crap from slashdot [12:07]
gac410yeah it's getting pretty bad [12:07]
daemonbail http://www.theregister.co.uk :)
oooh I bet latest BOFH article is out as well
talking about adverts
https://www.theregister.co.uk/2017/06/02/google_6_months_to_publishers/
[12:07]
................... (idle for 1h31mn)
gac410, I think I found another bug but I am not sure if my browser is creating the problem with its cache
oh cool
no its a bug with foswiki
different browser shows the same behaviour
[13:42]
gac410what's it doing? [13:44]
daemonif you type something in like .... DOMAIN.TLD/MyNewWeb
it will of course think it's a wikiword and go to the right page
but if you then go ... wait a minute actually I wanted to make a Web
so you change the url to
DOMAIN.TLD/mynewWeb
it treats it as a wikiword instead of a web
brings up the wrong form
[13:44]
gac410We auto-capitalize the first letter of Web or Topic ... To create a new web you have to use System.ManagingWeb. Not an automatic process like a new topic. [13:46]
daemonahh ok [13:46]
gac410Usually webs are pretty static - not a lot of creation happening on sites. [13:46]
daemonI am not recommended to use BSDM -_-
so I changed it over to viBSD
lol
[13:47]
gac410Maybe there is a corner case there. TLD/Someweb asks to create Main/Someweb. Strange. It should probably return access denied - Web "Someweb" does not exist.
I guess somewhere along the line someone was trying to be helpful, and look for a topic in the default web when the url is ambiguous
[13:50]
daemonah sweet
you actually cannot create 'viBSD' even using system.manage
[13:52]
gac410Actually that's already fixed in foswiki 2.2 alpha. We have a new parser that's a bit more strict. [13:52]
daemonAttention
"viBSD" is an invalid name for a new web
You are recommended to choose short names, preferably less than 10 characters, starting with an upper-case alphabetic character and using only alphanumeric characters. If you want to create a template web (a web just used as a base to create new webs) choose a name that starts with an underscore and has only alphanumeric characters.
[13:52]
gac410Right It must start with an upper case.
I can't answer why. Always been that way afaik
[13:52]
daemon:)
mhmm
I cannot create 'Vibsd' either
its trying to create it as a topic
[13:53]
gac410really ??? lemme try [13:55]
daemonTopic 'Vibsd' does not exist
Did you spell the topic name correctly? Remember, a topic name is case sensitive.
[13:55]
gac410worked fine here on 2.1.4 ... System.ManagingWebs. Web name "Vibsd" took defaults, clicked "Create web" [13:56]
daemonlet me see if I can reproduce this with a totally different one [13:57]
gac410ViBSD worked fine too. [13:57]
daemonok this is odd
its making EVERYTHING a topic
https://wiki.tmp.group/Weto
wants to be a topic too
[13:57]
gac410You went to the form on https://wiki.tmp.group/System/ManagingWebs and tried to create web Weto ... and it redirected to that error?? [13:59]
daemonI just went to: https://wiki.tmp.group/Weto
because for creating webs that has always just worked
let me try going through system managingwebs
[13:59]
gac410I've never heard of that working. ManagingWebs is the only way to create a web. It's a very controlled process using the bin/manage script [14:00]
daemon99% sure that's how I made BSDM and HWMatrix
actually not 99% 100%
that is how I made those
instead of coming up as a 'topic' it came up with the create web form
[14:00]
gac410That is really really strange. I've never heard of that. [14:01]
daemonthere we go Vibsd
now time to start writing the first episode
gac410, you know what did change
I was on RC 1 or 2?
then I upgraded to release
[14:02]
gac410There were very few changes, certainly nothing that added/removed forms for creating a web.
I can check out the RC1 tag and try it.
[14:04]
daemonI think somehow I was on RC2, let me look at the backup
from the root of the old server, what is the path to that file with the version in it
[14:04]
gac410Create a web is a very specific process. To my knowledge the only way to create a web is to use the bin/manage script
/path/to/foswiki/lib/Foswiki.pm
[14:05]
daemonuse version 0.77; $VERSION = version->declare('v2.1.3_002'); [14:06]
gac410look for $RELEASE [14:06]
daemon$RELEASE = 'Foswiki-2.1.4-RC2'; [14:07]
gac410okay... checking [14:07]
daemonif this comes back negative, I am checking in with a psychiatrist [14:07]
gac410Okay. If you visit Somemissingweb/SomeTopic. It gives an oops message for Web does not exist. That message has a link for the System.ManagingWebs page with the webname pre-filled [14:09]
daemonahh [14:09]
gac410If you go to Somemissingweb without the topic, then it tries to be helpful and looks for a topic of that name in the Main web. [14:09]
daemonthat must be it
I knew it was different somehow
[14:09]
gac410That last behavior is removed by the url parser in 2.2-alpha [14:10]
daemonleast I am not as crazy as I was starting to think I was [14:10]
............ (idle for 55mn)
***ChanServ sets mode: +o Lynnwood [15:05]
LynnwoodSeveral sites I manage have had this spam registration recently: BatikSlimfit. Googling this name +foswiki shows he's registered on foswiki installations all over the place. [15:18]
I've seen this before, where a spam user starts showing up on lots of Foswiki servers. Makes me wonder how hard it would be to add a feature to AntiWikiSpamPlugin to generate our own list of banned users. [15:27]
gac410It's a constant flood of made-up names. They come in waves. I've taken to also banning the subnets. The last one I banned then flipped to ipv6 and registered again.
Typically gmail addresses, and also waves from a webgarden domain.
If you are going to accept registrations, probably the best solution now is to enable registration approvals.
[15:28]
There are somewhere around 210 removed/spam users in our trash - though that is for the last year or so. Not sure when we last cleaned the trash [15:39]
.......... (idle for 49mn)
***ChanServ sets mode: +o Lynnwood [16:28]
daemongac410, Lynnwood an interesting one would be creating a fake foswiki
in fact 50 of them
cheap domains or site1.etc.etc
site2.etc.etc
and generate the global blacklist on those that register on it
kinda like a honeypot
would not have to be a particularly big or powerful server
from there an optional plugin on foswiki installs that checks the list periodically
whats more you could get a little help off your userbase with it
ask them to point spare domains or spare subdomains at it
[16:41]
gac410I think that there are some sites like that already. It's not just foswiki that's the target. A couple of years ago oholo had so many registrations they weren't tracking that it took down their databases. [16:43]
daemonkinda glad all my registrations require me manually to accept them [16:44]
gac410I came across one recently, can't remember the name of it.
Yeah, tbh that's the best solution.
[16:44]
daemonwhat do the bots do when they get on anyway
create a shit load of /StupidPages with links to viagrea
etc
[16:45]
gac410A lot is just link farming. Just register with web site url in their form. Some save a bunch of text with links, but that's rarer.
Essay writers is a popular one.
[16:46]
daemonah ok [16:48]
gac410We use AntiWikiSpamPlugin to watch for spam text. Sometimes a bit too aggressive. It blocks youtube It uses a spam collection managed by moinmoin
But tbh, most of the link farmers slip through. I review all registrations daily, and am pretty aggressive in deleting them.
[16:49]
daemonI never really dealt with a platform that deals with spam
I hope none of the various projects I am undertaking generates so much interest that it becomes a problem
lol
[16:50]
gac410Doesn't need to be a lot of interest. If they find a site that accepts registrations, they go for it, wiki or not [16:51]
daemonI bet mine has not got noticed by google and friends yet
in fact I do not know if it even has a robots.txt
[16:52]
gac410we ship a default one [16:53]
daemoncool
just looked at it
crawl-delay: 5 ?
is that days
months
[16:54]
gac410I think it's in seconds. not sure [16:54]
daemonjust looking it up
'Google does not support the crawl delay command directly, but you can lower your crawl priority inside Google Webmaster Central.'
and its not in seconds or any time
its a 'priority'
you can set different priorities for different parts of your site
if the search engine implements it in its crawler
so a crawl-delay that is global like it is in your robots.txt is actually meaningless
if this spec is right
[16:55]
gac410well I guess each bot is up to its own interpretation. According to wikipedia, Yandex: seconds between visits. Bing: time window during which there is only one request. [16:57]
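For illustration, a minimal robots.txt using the directive under discussion might look like this (the paths are hypothetical examples, not Foswiki's shipped file; as noted, Crawl-delay is non-standard and each crawler interprets it differently, with Google ignoring it entirely):

```
User-agent: *
# Yandex: seconds between visits; Bing: one request per 5-unit window; Google: ignored
Crawl-delay: 5
Disallow: /bin/
Disallow: /System/
```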
daemonlovely when we have a nice solid spec that everyone follows [16:58]
gac410Same problem with the "nofollow" directive which many bots interpret as "okay to follow, just don't index" So links that trigger searches can kill your site. [16:58]
daemonhmmm
would you be willing to give me a copy of a heavy foswiki's access log (excluding ip information)
[16:58]
gac410We had to add "%TABLE{sort="off"}%" to most tables on foswiki.org to stop the bots from clicking every sort link in every direction. [16:59]
daemonI just wonder, with a big enough sample set, if 'bots' set a static useragent
because if they do we can catch most of them
from that we could then decline certain intensive resources
[16:59]
gac410agent strings to the legit bots are pretty well known. But the bad ones often falsify agent strings. [17:00]
daemonI imagine the legit ones add enough load on their own
might be enough to help
not particularly difficult to implement either
[17:01]
gac410yeah. the bad ones though can bring sites down. One site I help had ONE bot, hitting every sort link on only one large topic, from multiple IPs every second or so. They were going to upgrade their server ... until I blocked the agent.
SemrushBot
[17:02]
daemonhmm
gives me an idea on how to guard against it
might try to write a module
interesting
looking at my access log
I have some weird accesses
holy crap
104 different ips have visited my site
apparently atm a SAMSUNG SM-N910C
someone's phone, that is
66.249.65.247 - - [03/May/2017:08:31:19 +0000] "GET /robots.txt HTTP/1.1" 404 143 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
that did not take them long
wonder why robots is 404
[17:03]
gac410There is also BlackListPlugin which is very old and takes a much more aggressive approach - blocks IP's that access too frequently, or update too much. [17:07]
daemongac410, that was what I was going to try as well as another little trick
generate a hidden link in all rendered pages
if the link is triggered by a high accessing client
ban
[17:07]
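A minimal sketch of the hidden trap-link idea above: a URL that robots.txt disallows and that no human-visible page links to, so any client fetching it is presumed to be a misbehaving crawler (the path, ban policy, and function names are invented for illustration):

```python
BANNED = set()
TRAP_PATH = "/trap/do-not-follow"  # hypothetical hidden link, also Disallow'd in robots.txt

def handle_request(client_ip, path):
    """Return True to serve the request, False to deny; ban anyone who hits the trap."""
    if client_ip in BANNED:
        return False
    if path == TRAP_PATH:
        # A polite crawler honoring robots.txt never lands here, nor does a human.
        BANNED.add(client_ip)
        return False
    return True
```

A real deployment would persist the ban list and expire entries rather than keep an in-memory set.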
gac410Hm the web server config needs an explicit alias for /robots.txt or it will get picked up by the rewrite for short URLs. [17:08]
daemonseems to work when I try it as a normal user
https://wiki.tmp.group/robots.txt
[03/Jun/2017:17:09:28 +0000] "GET /robots.txt HTTP/1.1" 304 0 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.114 Safari/537.36 Vivaldi/1.9.818.50"
[17:08]
gac410Alias /robots.txt "/var/www/foswiki.org/robots.txt" or the like in the config. [17:09]
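That alias is a one-liner in the Apache virtual host config; the filesystem path is site-specific and shown only as an example:

```
# Serve robots.txt as a plain file so the short-URL rewrite rules
# don't hand the request to Foswiki
Alias /robots.txt "/var/www/foswiki.org/robots.txt"
```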
daemonhold on though
why can I access it
[17:09]
gac410worked fine here too. [17:10]
daemonok this is really weird
oh f££k
https://paste.ee/p/SlW0C#ayHA5JKLg7EJIvgPwI9e6XnMGznH2bQB
look at the 404 one
and the user-agent
[17:10]
gac410Yes. We have SemrushBot in the BrowserMatch rules - they will always get a 404 [17:11]
daemonah
157.55.39.144 - - [30/May/2017:19:37:49 +0000] "GET /robots.txt HTTP/1.1" 404 143 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
5.255.251.17 - - [01/Jun/2017:15:33:08 +0000] "GET /robots.txt HTTP/1.1" 404 169 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)"
46.229.164.102 - - [02/Jun/2017:18:28:46 +0000] "GET /robots.txt HTTP/1.1" 404 143 "-" "Mozilla/5.0 (compatible; SemrushBot/1.2~bl; +http://www.semrush.com/bot.html)"
take it if it has 'bot' its 404
lol
ooh thats a new one
162.210.196.130 - - [31/May/2017:02:33:30 +0000] "GET /robots.txt HTTP/1.1" 404 169 "-" "Mozilla/5.0 (compatible; MJ12bot/v1.4.8; http://mj12bot.com/)"
almost every 404 is a bot
of some kind
[17:11]
gac410It is not supposed to be all bots. That's not good
Are you using apache or nginx
[17:13]
daemonnginx is my 'reverse proxy'
apache is my server
not that it makes any difference really
the apache one won't have any relative ip
except the reverse proxy's internal ip
but nginx's logs are nicer, more detailed
[17:13]
gac410okay... So you are showing the nginx log, not the apache log? [17:14]
daemonyep
they're pretty much the same, or should be
"DomainCrawler/3.0 (info@domaincrawler.com; http://www.domaincrawler.com/online-restaurant.de)"
how many bloody index bots are there
[17:14]
gac4101000's of them [17:15]
daemonuptimebot apparently visited me too [17:15]
gac410some are email harvesters, SEO optimizers looking for ad placements, etc.
helps to Block your System web to the WikiGuest user.
[17:15]
daemonthat is bloody sneaky check this [17:16]
gac410Less to index [17:16]
daemon[31/May/2017:02:33:30 +0000] "GET /robots.txt HTTP/1.1" 404 169 "-" "Mozilla/5.0 (compatible; MJ12bot/v1.4.8; http://mj12bot.com/)"
set the user-agent as part of the actual browser headers
but really its saying its firefox or so
one of them sent no useragent at all
incredible
[17:16]
gac410yup. Agent strings are close to useless. The "legitimate" bots use them correctly. Everyone else - all bets are off. Plugins on FF let you set whatever agent string you want. [17:17]
daemonyou know when you were telling me about the trouble bots caused indexing etc
I presumed 1 bot just going mad 100s of queries a second
did not expect a swarm of different ones
[17:18]
gac410It's swarms that come & go, some following a sane crawl delay, others using distributed IP addresses with nearly no delay. It's really painful.
And it costs Real $$ in cpu cycles, especially if you use a hosting site that restricts or bills for use
[17:19]
daemonyeah looking at this google and bing seem to be relatively well behaved
what I would love to know is the date on some of the earlier ones
they literally hit the site while I was in here talking to you about setting it up
within 1hr of it
[17:20]
gac410yup.
Another thing you can do is to "Disable guest sessions" in configure. That saves the overhead of creating a new session file for every bot hit.
[17:20]
daemonI do not have to worry too much, the server this is running on is VERY bored and very overpowered
in reality I am running my little foswiki almost dedicated on this:
CPU: Intel(R) Core(TM) i7-2600 CPU @ 3.40GHz (3411.56-MHz K8-class CPU)
FreeBSD/SMP: Multiprocessor System Detected: 8 CPUs
real memory = 17179869184 (16384 MB)
[17:21]
gac410Not sure why you would be getting google bot blocked, our rules do block an impersonator but not the real bots I don't think [17:23]
daemongac410, its not just google
its anything with /bot/ in it
[17:23]
gac410You should see messages in your apache log. "Blocked by server configuration" [17:23]
daemonlet me check the apache log instead of nginx [17:23]
gac410Ah if the agent string *begins* with bot, it will be blocked BrowserMatchNoCase ^bot blockAccess [17:24]
daemongac410, dood thats like almost all of them
well
thats not true
check this paste
[17:24]
gac410er... bing uses Mozilla/5.0 (compatible; bingbot/2.0; that should not match ^bot [17:25]
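gac410's point about the anchor can be checked with the equivalent pattern in Python: `re.match` anchors at the start of the string, just as `BrowserMatchNoCase ^bot` does, so bingbot's agent string, which begins with "Mozilla/5.0", should not be caught (the second agent string is made up):

```python
import re

agents = [
    "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)",
    "bot/1.0 (generic scraper)",  # invented agent that genuinely starts with "bot"
]

# re.match is anchored at the start of the string, like ^bot in Apache
blocked = [ua for ua in agents if re.match(r"bot", ua, re.IGNORECASE)]
print(blocked)  # only the agent that begins with "bot"
```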
daemonhttps://paste.ee/p/sOjMh
apache's config
err apaches access
shame it does not show the useragents
but they are getting through at least
[17:26]
gac410You can change the log string to add that stuff. [17:27]
daemonnah if anything I might disable 'accesslog' on apache
and just separate it on the nginx host on top
no point logging things twice
[17:27]
gac410true [17:27]
daemonhey just as a curiosity does foswiki do anything with 'client ip' [17:30]
gac410Actually the "blocked" messages are logged to the Apache error log ... AH01797: client denied by server configuration:
It logs it. I have a change planned / merged into 2.2 that will use the x-forwarded-for header to establish the client ip
Some plugins like the old BlackListPlugin need it.
[17:30]
daemonproxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
was the 'recommended'
for a reverse proxy
I think X-Real-IP is pretty common as well
[17:31]
gac410hm, X-Real-IP is not one I had come across. What did I use... looking [17:31]
daemonhttps://distinctplace.com/2014/04/23/story-behind-x-forwarded-for-and-x-real-ip-headers/ [17:33]
gac410Item14380 [17:33]
FoswikiBothttps://foswiki.org/Tasks/Item14380 [ Item14380: Foswiki should have option to use X-Forwarded-For to determine Client IP in reverse proxy configuration. ] [17:33]
daemonthey're both the exact same thing
but X-Real-IP seems to have turned up magically out of nowhere
[17:33]
gac410The X-Forwarded-For (XFF) HTTP header field is a common method for identifying the originating IP address of a client connecting to a web server through an HTTP proxy or load balancer. [17:34]
daemonok dokey man, catch you later [17:35]
........... (idle for 51mn)
***ChanServ sets mode: +o Lynnwood
ChanServ sets mode: +o Lynnwood
[18:26]
................... (idle for 1h30mn)
gac410daemon: https://tools.ietf.org/html/rfc7239 is the standard for proxy headers. X-Real-IP seems to be an nginx special. By the looks of the RFC, I probably have some work for 2.2 on how to fully support these headers.
hm actually no, this is defining a new header, not the X- extensions. yeesh.
[20:00]
daemonit seems from what I read
both are commonly set
[20:13]
gac410In any event, the question becomes how CGI / Apache2::Request / FCGI present them to the environment. Hopefully we don't have to actually read the headers. [20:14]
daemongac410, the thing I found with the most detail for 'the difference' was this http://www.programering.com/a/MzMxADOwATM.html
it looks like X-Real-IP is a singular item 'the client ip'
whereas X-Forwarded-For can contain a list
[20:14]
gac410Yes, List where the first address in the list is the client address, and others are proxies.
But they get put into the ENV{} as HTTP_FORWARDED_FOR in my testing at least with Apache as the proxy. I saw that and also HTTP_FORWARDED_HOST
Thing to check is the System.FoswikiServerInformation That should dump all the %ENV{} variables.
[20:15]
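A minimal sketch of the convention gac410 describes, taking the leftmost X-Forwarded-For entry as the client address (a real implementation should only trust this header when the request arrives from a known proxy, since clients can forge it):

```python
def client_ip_from_xff(xff_header, fallback):
    """Take the first (leftmost) address from an X-Forwarded-For list;
    the remaining entries are the proxies the request passed through."""
    if not xff_header:
        return fallback  # no proxy header: use the socket peer address
    first = xff_header.split(",")[0].strip()
    return first or fallback

print(client_ip_from_xff("203.0.113.7, 10.0.0.2, 10.0.0.3", "10.0.0.1"))
print(client_ip_from_xff(None, "10.0.0.1"))
```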
daemonquite suprising we do not have a solid set standard for it by now [20:18]
gac410er... no... it's $ENV{HTTP_X_FORWARDED_FOR}% [20:18]
daemonconsidering how many reverse proxies and such there are [20:18]
gac410y indeed. CGI has a virtual_host() request which is supposed to figure out the forwarded host part.
Foswiki doesn't use it. VirtualHostingContrib does though, that's where I learned about that one.
Looking at the %ENV I can't find anything that reflects the original protocol - http or https. Though the RFC claims Forwarded-host
[20:19]
daemonI could make a header display it
X-Source-Protocol
or something
but I do not think its normal to pass them
[20:22]
gac410well, what we really need is the formula to make this work in generic proxy environments.
The problem is Foswiki has to generate the <base> and <href=> links for either http or https as used by the client, not the foswiki server.
Hence the ForceDefaultUrlHost needed when the real host (determinable) and protocol (indeterminate) have to be applied to generated links.
Or during bootstrap, when foswiki tries to detect its protocol and url for the DefaultUrlHost setting.
[20:22]
daemon[Sat Jun 03 20:25:40.997318 2017] [fcgid:warn] [pid 65556] [client 2a01:4f8:150:64cd::1:100:28169] mod_fcgid: HTTP request length 135976 (so far) exceeds MaxRequestLen (131072), referer: https://wiki.tmp.group/bin/attach/Vibsd/HardwareUsed
uploading an image
max file size upload option somewhere?
[20:25]
gac410yup. There are 2 or 3 places. 1) It's a web preference, default is 10M 2) the web server might have a restriction, and 3) FCGI has one too :P [20:26]
daemonfcgid by the looks of it
that image is nowhere near 10M
[20:27]
gac410y.. It's got a very small default. Foswiki:Support.ApacheConfigGenerator has a field where you can define the desired size. [20:28]
FoswikiBothttps://foswiki.org/Support.ApacheConfigGenerator [ ApacheConfigGenerator ] [20:28]
gac410FcgidMaxRequestLen 10000000
Darn... I thought I had added comments for the other related settings. I guess not.
[20:29]
daemonhttps://wiki.tmp.group/Vibsd/HardwareUsed
looking pretty
:)
[20:31]
gac410nginx also has client_max_body_size 10M;
and foswiki setting, search for ATTACHFILESIZELIMIT either in DefaultPreferences (don't modify), SitePreferences or WebPreferences.
[20:31]
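Gathering the limits mentioned here in one place, with example values rather than drop-in config:

```
# Apache mod_fcgid: the limit daemon hit; the default is only 128 KB
FcgidMaxRequestLen 10000000

# nginx in front, when it acts as the reverse proxy
client_max_body_size 10M;
```

plus the Foswiki-side ATTACHFILESIZELIMIT preference, set in SitePreferences or a web's WebPreferences as described above.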
daemongac410, that image is 160K
lol
[20:32]
gac410, can I override the 'foswiki' logo in a particular web? [20:41]
gac410I think so ... Really anything can be overridden right to the Topic level, unless set in the FINAL settings at a level
Foswiki:System.PatternSkinCustomization#Logo
[20:42]
FoswikiBothttps://foswiki.org/System.PatternSkinCustomization#Logo [ PatternSkinCustomization ] [20:43]
daemongac410, not sure if this is my eyes
ah I know why
nm :)
[20:49]
................. (idle for 1h23mn)
rouiljgac410, thinking about banning bots, I whitelist some bots and then use a set of rules that look at the Host: header and requires that it match a valid hostname for my server. If it's an IP address I ban the client.
Are your misbehaving bots using foswiki[dot]org as the host header or are they hitting it by the ip?
[22:13]
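A minimal sketch of rouilj's Host-header check using only the stdlib (the hostname whitelist is illustrative; IPv6-with-port parsing is omitted for brevity):

```python
import ipaddress

VALID_HOSTS = {"foswiki.org", "www.foswiki.org"}  # illustrative whitelist

def host_header_ok(host):
    """Accept known hostnames; reject bare-IP or unknown Host: headers."""
    name = host.split(":")[0]  # drop an optional :port (hostname or IPv4 only)
    try:
        ipaddress.ip_address(name)
        return False  # Host: is a raw IP -> ban the client, per rouilj's rule
    except ValueError:
        return name.lower() in VALID_HOSTS

print(host_header_ok("foswiki.org"))
print(host_header_ok("203.0.113.7:80"))
```

The same rule is usually expressed directly in the web server config (e.g. a default virtual host that denies everything), but the logic is the one above.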
............. (idle for 1h0mn)
***ChanServ sets mode: +o Lynnwood__ [23:14]
........ (idle for 39mn)
gac410rouilj: I've never looked at the host headers. and damn, our default host responds as f.o to the IP address. [23:53]
It doesn't look like we log it, so we'd have to change the apache log format to know. [23:58]
