
Showing posts from July, 2021

Office 365 to G Suite

Add the domain, then verify it before users can log in and use their Gmail IDs. Each user must also log in to their account and accept the new terms and conditions before they can receive mail.

Ensure customers accept the Terms of Service, and follow up with new customers. When you add a new Google Workspace customer, you enter the customer's email address (do not enter your own email address). The customer automatically receives an email with their administrator name, temporary password, and instructions to sign in to the Admin console. Follow up with your customer to make sure they sign in to the Admin console and accept the TOS.
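A quick way to confirm the verification record is actually published is to query DNS for the Google site-verification TXT record. A minimal sketch, assuming verification was done via a TXT record; example.com is a placeholder for the customer's domain:

# check that the google-site-verification TXT record is visible in DNS
# (example.com is a placeholder domain)
dig +short TXT example.com | grep -i google-site-verification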

Exim spam mails: email ID compromised, bulk mailing

Remove the compromised mail ID from the accepted senders list and add it to the rejected senders list:

vi /etc/exim_accept_senders    # remove the compromised address from this file
vi /etc/exim_reject_senders    # add the compromised address here

Remove all IPs from /etc/virtual/pophosts, then restart the services:

systemctl restart dovecot
systemctl restart exim

Watch the mail log to confirm the spam is now being rejected:

tail -f /var/log/exim/mainlog

H=(User) [195.133.40.218] F=<b.prashant@drushti.in> rejected RCPT <genijanvier@yahoo.com>
2021-07-29 13:03:58 H=(User) [195.133.40.218] F=<b.prashant@drushti.in> rejected RCPT <marilynlh@comcast.net>
2021-07-29 13:03:58 H=(User) [195.133.40.218] F=<b.prashant@drushti.in> rejected RCPT <barbthorne@live.ca>
2021-07-29 13:03:58 H=(User) [195.133.40.218] F=<b.prashant@drushti.in> rejected RCPT <mario.natividad@appliedmetering.com>
2021-07-29 13:03:58 H=(User) [195.133.40.218] F=<b.prashant@drushti.in> rejected RCPT <marinainla@aol.com>
2021-07-29 13:03:58 H=(User) [136.144.41.190] F=<b.prashant@drushti.in> rejected RCPT <favpor@aol.com>
2021-07-29 13:03:58 H=(User) [136.144.41.190] F=<
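Once the flood has stopped, it can help to summarize the rejected deliveries by source IP. A rough sketch against the log format above (the awk field number assumes entries shaped exactly like the sample lines):

# count rejected RCPT lines per source IP, busiest first
grep "rejected RCPT" /var/log/exim/mainlog | awk '{print $4}' | sort | uniq -c | sort -rn | head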

Domain buying

https://tld-list.com/
savefrom.com.co

Sitemap could not be read: solution for All in One SEO plugin and Rank Math

Go to this directory:

/home/mp3songs.fun/public_html/wp-content/plugins/all-in-one-seo-pack/app/Common/Sitemap

Open the file and change noindex to index. It is located in the headers function:

public function headers() {
        $charset = get_option( 'blog_charset' );
        header( "Content-Type: text/xml; charset=$charset", true );
        header( 'X-Robots-Tag: index, follow', true );

After doing this, just index the page manually from Google. After some days it will be indexed directly and Google will crawl all pages.

For Rank Math, edit includes/modules/sitemap/abstract-xml.php:

protected function send_headers( $headers = [], $is_xsl = false ) {
        $defaults = [
                'X-Robots-Tag'  => 'index',

Change noindex to index in X-Robots-Tag, and add the sitemap as a menu item so that Google crawls it after checking in Search Console. For https://mp3songs.fun/sitemap_index.xml/ add the / at the end, then there is no 404.
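To confirm the change took effect, you can inspect the response headers of the sitemap URL (using the URL from the post):

# the X-Robots-Tag header should now say index, not noindex
curl -sI https://mp3songs.fun/sitemap_index.xml/ | grep -i x-robots-tag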

AH01623 [allowmethods:error] Issue on DirectAdmin & How to Fix it

This post is for my personal reference for future use. If you are using a VPS or a dedicated server and installed DirectAdmin as a web/server panel, you may sometimes encounter the following error:

[Sat May 02 07:27:45.728967 2020] [allowmethods:error] [pid 20266:tid XXXXXXX848] [client XXXXXXX] AH01623: client method denied by server configuration: 'PATCH' to /home/XXXXX/domains/XXXXXXX, referer: https://XXXXXXXX

This means that the server does not allow a specific request method, which in my case is the PATCH request method. To make PATCH work on a server using DirectAdmin, we need to enable it. SSH to your server and type the following commands:

cd /usr/local/directadmin/custombuild
./build set http_methods GET:HEAD:POST:PUT:DELETE:PATCH:OPTIONS
./build rewrite_confs

Fix: this is due to your web server blocking the HTTP method (OPTIONS in the example above). You will need to allow this method within your Apache configuration and ensure you have nothing like a <Limit> or <LimitExcept> block denying it.
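For reference, this is roughly what the resulting Apache 2.4 directive looks like (mod_allowmethods). A sketch only; the directory path is a placeholder, and on DirectAdmin the custombuild commands above generate the equivalent for you:

<Directory "/home/XXXXX/domains/example.com/public_html">
    # allow the extra methods alongside the defaults
    AllowMethods GET HEAD POST PUT DELETE PATCH OPTIONS
</Directory>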

Cross-origin OpenLiteSpeed

[Image]

Weekly tz backup for Zimbra

Crontab entries (the export runs Saturday at 01:00; the previous week's archives are cleared Friday at 22:00):

0 1 * * 6 /bin/sh /home/sas/script/weekly-tz-script/weekly-tz.sh
0 22 * * 5 rm -rf /Offlinebackup/tz-data/tz-data-weekly-backup/*.tgz

weekly-tz.sh:

#!/bin/sh
Date1=`date +%d-%m-%Y`
T1=`date`
# list all accounts, then export each mailbox as a tgz archive
/opt/zimbra/bin/zmprov -l gaa > /home/sas/script/weekly-tz-script/emails.txt
for email in `cat /home/sas/script/weekly-tz-script/emails.txt`; do
  /opt/zimbra/bin/zmmailbox -z -m $email getRestURL '/?fmt=tgz' > /Offlinebackup/tz-data/tz-data-weekly-backup/$email.tgz
  echo $email
done > /home/sas/script/logs/opt.weeklytz.log 2>&1
T2=`date`
echo "START :  " $T1 >> /home/sas/script/logs/opt.weeklytz.log
echo "END   :  " $T2 >> /home/sas/script/logs/opt.weeklytz.log
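To restore a single mailbox from one of these archives, zmmailbox can post the tgz back. A hedged sketch; the address and filename are placeholders, and resolve=skip is one choice among skip/modify/reset/replace:

# re-import a mailbox export produced by getRestURL '/?fmt=tgz'
/opt/zimbra/bin/zmmailbox -z -m user@example.com postRestURL '/?fmt=tgz&resolve=skip' /Offlinebackup/tz-data/tz-data-weekly-backup/user@example.com.tgz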

Finding the best keywords with Ahrefs

[Image]
Search a keyword, then in Volume enter 0-1000 and click on Volume to reverse-sort it. You will get all the keywords showing n/a; these are the best keywords to fight for. From the many matching terms, select the one most suitable for your needs.

Exim mail bounce back - retry time not reached for any host after a long failure period

I recently came across the following Exim error when sending emails to specific domains:

T=remote_smtp: retry time not reached for any host after a long failure period

The first thing I did was check that there was nothing wrong with the resolver on my server, and I ran a few DNS lookups from my server against the recipient's domain. Once I was sure that both the resolver and the recipient's DNS were correct, I had to do the following:

1.) Go to /var/spool/exim/db:

cd /var/spool/exim/db

2.) Delete these files:

rm retry retry.lockfile wait-remote_smtp wait-remote_smtp.lockfile

3.) Then restart your Exim service:

service exim restart

After that you should be able to send emails to the domain in question as normal.
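Before deleting the hint databases, you can also inspect what Exim has recorded, which confirms the stale retry entries are the culprit. A small sketch using the stock exim_dumpdb utility; note it takes the spool directory itself, not the db subdirectory, and the binary path may differ by distro:

# dump the retry database to see the recorded failures
/usr/sbin/exim_dumpdb /var/spool/exim retry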

Creating a JavaScript cookie on a domain and reading it across subdomains

Below is a JavaScript cookie that is written on the user's computer for 12 months. After we set the cookie on our main domain such as example.com, should the user visit a subdomain like test.example.com, we need to continue to identify the activity of the user across our "test" subdomain. But with the current code, as soon as they leave www.example.com and visit test.example.com, they are no longer flagged as "HelloWorld". Would anyone be able to help with my code to allow the cookie to be read across subdomains?

<script type="text/javascript">
var cookieName = 'HelloWorld';
var cookieValue = 'HelloWorld';
var myDate = new Date();
myDate.setMonth(myDate.getMonth() + 12);
document.cookie = cookieName + "=" + cookieValue + ";expires=" + myDate;
</script>

Just set the domain and path attributes on your cookie, like:

<script type="text/javascript">
var cookieName = 'HelloWorld';
var cookieValue = 'HelloWorld';
var myDate = new Date();
myDate.setMonth(myDate.getMonth() + 12);
// domain=.example.com makes the cookie visible on every subdomain;
// path=/ makes it visible across the whole site
document.cookie = cookieName + "=" + cookieValue +
    ";expires=" + myDate +
    ";domain=.example.com;path=/";
</script>

.htaccess subdomain with subfolder access

I am trying to rewrite a subdomain request to a subfolder on my server using .htaccess. I want mail.domain.com to look into the mail folder located in the root. I am able to achieve this with the code below:

RewriteEngine on
RewriteCond %{HTTP_HOST} mail.domain.com$
RewriteCond %{REQUEST_URI} !^/mail
RewriteRule ^(.*)$ /mail/$1 [L]

This WORKS correctly: when I browse to mail.domain.com I get the contents of domain.com/mail/index.php. However, this doesn't work with the subfolders inside the subdomain, i.e. when I browse to mail.domain.com/installer it DOESN'T give the contents of domain.com/mail/installer/index.php; instead it shows a 404 error.

Keep your root .htaccess like this:

RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} =mail.domain.com
RewriteRule ^((?!mail).*)$ mail/$1 [L,NC]
RewriteRule ^(index\.php$|mail) - [L,NC]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . index.php [L]
RewriteCond %{HTTP_HOS
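A quick way to verify the rewrite from the command line, assuming the site is live and mail.domain.com resolves (both are the example names from the question):

# should print 200 once subfolder requests resolve to /mail/installer/
curl -s -o /dev/null -w "%{http_code}\n" http://mail.domain.com/installer/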