Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
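To make the auto-generation idea concrete, here is a minimal sketch of how a site might derive a description meta tag from a page's body text. The function name, the 160-character limit, and the sample text are all assumptions for illustration; a real pipeline would usually pull structured data (product details, author, date) rather than truncating raw text.

```python
import html
import re

def make_meta_description(page_text: str, max_len: int = 160) -> str:
    """Build a <meta name="description"> tag from a page's body text.

    Illustrative sketch only: truncates cleaned text at a word
    boundary and HTML-escapes it for safe use in an attribute.
    """
    # Collapse runs of whitespace into single spaces.
    summary = re.sub(r"\s+", " ", page_text).strip()
    if len(summary) > max_len:
        # Cut at the last word boundary before the limit.
        summary = summary[:max_len].rsplit(" ", 1)[0] + "..."
    escaped = html.escape(summary, quote=True)
    return f'<meta name="description" content="{escaped}">'

tag = make_meta_description(
    "Hand-thrown stoneware mugs, fired in small batches. "
    "Each piece is glazed individually, so no two are alike."
)
print(tag)
```

Because the output differs per page, each URL ends up with a unique snippet, which is the property the guidance above is after.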

This topic actually seems quite controversial. Google answered the question with what could be taken as a denial, but their answer was open to interpretation. On the other hand, there are studies (one of them from Moz) showing that linking out has an impact. So how can you be so assertive? Is this something that comes from your own experiments?
The document containing a hyperlink is known as its source document. For example, in an online reference work such as Wikipedia, many words and terms in the text are hyperlinked to definitions of those terms. Hyperlinks are often used to implement reference mechanisms such as tables of contents, footnotes, bibliographies, indexes, and glossaries.

Simply look around at other sites—your clients, your partners, your providers, associations you’re a part of, local directories, or even some local communities or influencers. These are all websites that can have a huge impact on your SEO as well as help you get traffic and raise awareness for your business. You are probably already doing business with most of them, so simply ask for a mention, a case study, a testimonial, or other types of backlinks.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
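To make the last point concrete, consider a minimal robots.txt (the directory names here are invented for illustration):

```
User-agent: *
Disallow: /private-reports/
Disallow: /drafts/
```

Anyone can fetch this file directly at /robots.txt and read exactly which paths the site owner wanted hidden, and a non-compliant crawler can simply ignore the Disallow lines. For genuinely sensitive content, server-side authentication or password protection is the appropriate control, not robots.txt.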
As an alternative to tackling this on your own, try using a service like Mayple to match you with your industry's best marketing experts. Mayple is a platform that connects business owners to vetted marketing experts, so you can focus on running your business and delegate the rest to experienced professionals — all you need to do is fill out a brief explaining your business's goals. It even monitors your project's progress and ensures your expert always delivers the best results. Get started today.