
Allow custom robots text pages #1104

Merged
merged 4 commits into master from robots-page
Mar 21, 2021
Conversation

JoelSpeed
Member

@JoelSpeed JoelSpeed commented Mar 13, 2021

Description

This allows users to replace our static robots.txt content by using the templates dir option and placing a robots.txt inside the directory.

Motivation and Context

Users should be able to customise this content.
Fixes #1068

This also sets the stage for serving other static files.
At the moment, we embed CSS and other files in our pages; maybe we want to serve those as static files too?

How Has This Been Tested?

Unit testing only.

Checklist:

  • My change requires a change to the documentation or CHANGELOG.
  • I have updated the documentation/CHANGELOG accordingly.
  • I have created a feature (non-master) branch for my PR.

@JoelSpeed JoelSpeed requested a review from a team as a code owner March 13, 2021 10:00
@JoelSpeed JoelSpeed added this to In Progress (Pull Requests) in Release v7.1.0 via automation Mar 13, 2021
_, err := rw.Write(content)
if err != nil {
	logger.Printf("Error writing %q: %v", pageName, err)
	s.errorPageWriter.WriteErrorPage(rw, http.StatusInternalServerError, "", err.Error())
}
Member

This will need a req *http.Request after my RequestID PR to extract out the {{.RequestID}} from the request.

// Static files include:
// - robots.txt
func loadStaticPages(customDir string) (map[string][]byte, error) {
pages := make(map[string][]byte)
Member

For how this is passed around and manipulated by reference in all these functions, does this make sense as a type (or full struct) with these all as member methods manipulating it?

Member

@NickMeves NickMeves Mar 13, 2021

Maybe a struct, by the looks of it, with pages and customDir as member fields?

Expect(err).ToNot(HaveOccurred())

robotsTxtFile := filepath.Join(customDir, robotsTxtName)
Expect(ioutil.WriteFile(robotsTxtFile, []byte(customRobots), 0600)).To(Succeed())
Member

Suggested change:
-	Expect(ioutil.WriteFile(robotsTxtFile, []byte(customRobots), 0600)).To(Succeed())
+	Expect(ioutil.WriteFile(robotsTxtFile, []byte(customRobots), 0400)).To(Succeed())


var _ = Describe("Static Pages", func() {
var customDir string
const customRobots = "I AM A ROBOT!!!"
Member

NIT: Can we have this custom test payload resemble the traditional robots.txt structure?
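For reference, a conventional robots.txt payload follows the User-agent/Disallow structure, e.g.:

```
User-agent: *
Disallow: /
```

A test fixture in this shape would exercise the same code path while looking like real robots.txt content.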

@JoelSpeed
Member Author

@NickMeves please take a look now that it has been updated. I created a separate pageGetter struct; not sure whether it makes things simpler or not, WDYT?

NickMeves
NickMeves previously approved these changes Mar 21, 2021
Member

@NickMeves NickMeves left a comment

LGTM!

I agree on the code complexity/size even with the struct route, but at least it's a little easier to eyeball/follow than a complex map passed around to be manipulated -- thanks for making the shift!

@JoelSpeed
Member Author

Added changelog entry; no other changes since last review.

@JoelSpeed JoelSpeed merged commit 455d649 into master Mar 21, 2021
Release v7.1.0 automation moved this from In Progress (Pull Requests) to Done Mar 21, 2021
@JoelSpeed JoelSpeed deleted the robots-page branch March 21, 2021 20:03
Successfully merging this pull request may close these issues:

[Question] Can I deactivate Endpoints like robots.txt?