Q: The SSL certificate I applied for is bound to the www domain, https://www.dtjxpj.com/, but right now only the version without www can be opened. After I enabled SSL on the virtual host, the site stopped opening at all.
A: Hello, our test just now shows the site is accessible on our side. What error do you see when it fails to open? Please take a screenshot of the browser error, ping www.dtjxpj.com, and post both screenshots in this ticket so we can assist.
Thank you very much for your continued support of our company!
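A certificate issued for www.dtjxpj.com does not cover the bare dtjxpj.com, so HTTPS requests to the bare name fail the certificate check even when the binding itself works. One common remedy, sketched below purely as an illustration with the IIS URL Rewrite module (it is not confirmed as the fix applied in this ticket), is a 301 from the bare host to the www host:

    <rule name="Redirect bare domain to www" stopProcessing="true">
      <match url="^(.*)$" />
      <conditions>
        <!-- only fire when the request arrived on the bare (non-www) host -->
        <add input="{HTTP_HOST}" pattern="^dtjxpj\.com$" />
      </conditions>
      <!-- send the visitor to the host name the certificate actually covers -->
      <action type="Redirect" url="https://www.dtjxpj.com/{R:1}" redirectType="Permanent" />
    </rule>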
Q: It seems to have been a caching problem. Separately, how do I force HTTPS? The previous holder of this domain apparently used it for illicit purposes and it was reported so many times that browsers show a security warning on every visit over http, which is why I added HTTPS. But when I type the domain it still lands on http first.
A: Hello, we can see that a 301 redirect from http to https has already been added, and our current tests show the site loads normally.
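For reference, a rule like the following is the usual way to force HTTPS with the IIS URL Rewrite module. This is a minimal sketch assuming the standard {HTTPS} server variable; hosts behind a proxy (such as the setup quoted later, which checks a custom HTTP_FROM_HTTPS header) may need a provider-specific variable instead:

    <rule name="Force HTTPS" stopProcessing="true">
      <!-- match every URL on the http side -->
      <match url="^(.*)$" />
      <conditions>
        <!-- {HTTPS} is OFF for plain-http requests -->
        <add input="{HTTPS}" pattern="^OFF$" />
      </conditions>
      <!-- 301 so browsers and search engines remember the https address -->
      <action type="Redirect" url="https://www.dtjxpj.com/{R:1}" redirectType="Permanent" />
    </rule>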
Q: Hello, how can I block every search engine except the mainstream ones such as Baidu, Sogou, and Google? Does the sample code below (from http://www.shinetop.cn/faq/list.asp?unid=820) leave Baidu, Google, and the other mainstream engines unblocked?

    <rule name="Block spider">
      <match url="(^robots.txt$)" ignoreCase="false" negate="true" />
      <conditions>
        <add input="{HTTP_USER_AGENT}" pattern="SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|curl|perl|Python|Wget|Xenu|ZmEu" ignoreCase="true" />
      </conditions>
      <action type="AbortRequest" />
    </rule>

I want to keep the Baidu, Sogou, and Google crawlers and block all the others; how should I modify the code? Thank you! I also have rules that filter IPs and referrer domains, plus the 301 redirect to https. Is the way I inserted the spider-blocking rule below correct?

    <configuration>
      <system.webServer>
        <rewrite>
          <rules>
            <rule name="Block website access">
              <match url="^(.*)$" ignoreCase="false" />
              <conditions logicalGrouping="MatchAny">
                <add input="{HTTP_REFERER}" pattern="www.123.com" />
                <add input="{HTTP_REFERER}" pattern="www.111.org" />
                <add input="{HTTP_REFERER}" pattern="www.12.cn" />
              </conditions>
              <action type="AbortRequest" />
            </rule>
            <rule name="Block spider">
              <match url="(^robots.txt$)" ignoreCase="false" negate="true" />
              <conditions>
                <add input="{HTTP_USER_AGENT}" pattern="SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|curl|perl|Python|Wget|Xenu|ZmEu" ignoreCase="true" />
              </conditions>
              <action type="AbortRequest" />
            </rule>
            <rule name="Block website ip" stopProcessing="true">
              <match url="(.*)" />
              <conditions logicalGrouping="MatchAny">
                <add input="%{HTTP_X_FORWARDED_FOR}&amp;%{REMOTE_ADDR}&amp;%{HTTP_X_Real_IP}" pattern="(127.0.0.1|127.0.0.1)" />
              </conditions>
              <action type="AbortRequest" />
            </rule>
            <rule name="301" stopProcessing="true">
              <match url="^(.*)$" ignoreCase="false" />
              <conditions logicalGrouping="MatchAll">
                <add input="{HTTP_FROM_HTTPS}" pattern="^on$" negate="true" />
              </conditions>
              <action type="Redirect" url="https://www.abc.com/{R:1}" redirectType="Permanent" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>
    </configuration>
A: Hello, yes. As long as the spiders you want to keep are not listed in the blocking pattern, they will not be blocked. And yes, you have inserted the rule in the correct position.
Thank you very much for your continued support of our company!
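The rule quoted in the question is a blacklist: it aborts any request whose User-Agent matches one of the named crawlers, so anything not listed (including Baidu, Sogou, and Google) gets through. For the whitelist behaviour the question actually asks about, blocking every self-identified crawler except those three, a sketch along these lines should work with the same module. The generic spider|bot|crawl match and the Baiduspider/Sogou/Googlebot tokens are assumptions based on the commonly published user-agent strings, and note that User-Agent headers can be spoofed:

    <rule name="Block non-mainstream spiders">
      <!-- never block robots.txt itself -->
      <match url="(^robots.txt$)" ignoreCase="false" negate="true" />
      <conditions logicalGrouping="MatchAll">
        <!-- the request identifies itself as a crawler... -->
        <add input="{HTTP_USER_AGENT}" pattern="spider|bot|crawl" ignoreCase="true" />
        <!-- ...but is not one of the engines we want to keep -->
        <add input="{HTTP_USER_AGENT}" pattern="Baiduspider|Sogou|Googlebot" ignoreCase="true" negate="true" />
      </conditions>
      <action type="AbortRequest" />
    </rule>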
Q: Hello, I will try it out and keep an eye on it. Thank you very much for your help!
A: Hello, you're welcome. Thank you very much for your continued support of our company, and please accept our apologies for any inconvenience caused. Thank you!