Discussion Forum

Title: Ditch duplicate titles.

Author: latix60748@egvo    Time: 2025-3-6 12:28
Title: Ditch duplicate titles.

Unfortunately, there are two problems with "allinurl:". First, you can't reliably combine it with "site:", which limits your options. Second, it tends to return strange results: for example, the top result when I searched from the US was Amazon France. In most cases, I recommend using multiple "inurl:" operators instead.
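As a sketch of the recommendation above, here is a small Python helper (the `split_allinurl` name and structure are my own, purely illustrative) that rewrites an "allinurl:" query into the equivalent chain of "inurl:" terms:

```python
def split_allinurl(query):
    """Rewrite an 'allinurl:' query into separate 'inurl:' terms.

    Hypothetical helper: each whitespace-separated word after the
    'allinurl:' prefix becomes its own 'inurl:' operator.
    """
    prefix = "allinurl:"
    if not query.startswith(prefix):
        return query  # nothing to rewrite
    terms = query[len(prefix):].split()
    return " ".join(f"inurl:{t}" for t in terms)

print(split_allinurl("allinurl:hot wheels"))
# inurl:hot inurl:wheels
```

The rewritten form behaves more predictably and, unlike "allinurl:", can be combined with "site:".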

58. Find stray text files.
site:amazon.com filetype:txt -inurl:robots.txt

You may be wondering whether you have stray documentation files left on your site that Google has discovered. You can combine "site:" and "filetype:" to check:


In this case, you need to exclude "robots.txt" (using "-inurl:") because Amazon has dozens of robots.txt files. This combination is a great way to clean up files accidentally left on your website.
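The operator combinations in this section compose mechanically, which a short Python sketch can show. The `google_query` helper below is my own illustrative construction, not something from the article:

```python
def google_query(site=None, filetype=None, phrase=None,
                 intitle=None, exclude_inurl=()):
    """Assemble a Google query string from common search operators."""
    parts = []
    if site:
        parts.append(f"site:{site}")
    if filetype:
        parts.append(f"filetype:{filetype}")
    if intitle:
        parts.append(f'intitle:"{intitle}"')
    if phrase:
        parts.append(f'"{phrase}"')
    # Each excluded URL fragment becomes a -inurl: term.
    parts.extend(f"-inurl:{frag}" for frag in exclude_inurl)
    return " ".join(parts)

# The stray-text-files check from above:
print(google_query(site="amazon.com", filetype="txt",
                   exclude_inurl=["robots.txt"]))
# site:amazon.com filetype:txt -inurl:robots.txt
```

Treating each operator as an independent term like this makes it easy to add or drop filters as you narrow a search.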

59. Dig deeper for duplicate content.
site:amazon.com "Hot Wheels 20 Car Gift Pack"

Sites like Amazon have huge potential for internal duplicate content. Using the "site:" operator with an exact-match phrase, you can start pinning down near-duplicates:


In this case, Google would still return about 1,000 results. Time to dig deeper…

site:amazon.com intitle:"Hot Wheels 20 Car Gift Pack"

You can use "site:" followed by "intitle:" to specifically find pages on your site that may be exact duplicates.


Believe it or not, Google still returned over 100 matching pages. Let's keep at it...

61. Find duplicate titles and exclude them.
site:amazon.com intitle:"Hot Wheels 20 Car Gift Pack" -inurl:review
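Once you have a query like this, turning it into a shareable search link is just URL-encoding. A minimal, self-contained sketch using only the Python standard library (the query string is the one from this item):

```python
from urllib.parse import quote_plus

# The duplicate-title query from above, URL-encoded for a search link.
query = 'site:amazon.com intitle:"Hot Wheels 20 Car Gift Pack" -inurl:review'
url = "https://www.google.com/search?q=" + quote_plus(query)
print(url)
```

`quote_plus` handles the quotes, colons, and spaces that would otherwise break the URL.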

Welcome to Discussion Forum (http://15699.ibbs.tw/) Powered by Discuz! X2.5