Published on 00/00/0000
Last updated on 00/00/0000
Generally speaking, code generation is the process of producing code from some sort of abstract description. That's not a very precise definition, yet almost everyone has an intuitive understanding of code generation. In the cloud native ecosystem, for example, generating code from Protobuf or OpenAPI descriptors is quite common. Code generation, however, is much more than that. Besides specific, (very) high-level definition languages, program code itself is also such an abstract description. Generating code from code is exactly what happens when a compiler translates program code written in a high-level programming language into machine code: it parses the code, optimizes it, and then generates a set of instructions that the computer can understand.

High-level programming languages exist so that we don't have to speak to the machine directly. Talking to the machine is tedious, because it works differently than the human brain. Similarly, we don't handle repetition very well either. At least in computer engineering, if something needs to be done more than once, we want to automate it. We've invented a lot of tools that help us avoid repeating code, from reusable libraries to parametric polymorphism (commonly known as generics). What if we could automate writing code as well? In other words: what if we could generate code (in the same language) from code?

In this post, you can read about generating Go code using the same tools Kubebuilder uses. As an example, we are going to implement a shallow copy generator for structs (which doesn't make much sense on its own, but serves as an excellent example).
Generating code (with some simplification) can be split into two phases:

1. parsing the input (source code, an IDL descriptor, etc.) into an intermediate representation
2. generating code based on that intermediate representation
The intermediate representation can vary: it can be as simple as the set of parameters of a function. The important thing is to have an intermediate layer that acts as a "DTO" between the parser and the code generator, so they don't depend on each other directly.

For the actual code generation, there are several common solutions in Go. They are not actually specific to Go (i.e. the same ideas can be applied to other languages as well), but I want to show you how you can do it in Go. The first (and probably the easiest) way is templating. Almost every language has its own templating solution (either built-in or as a userland library). Go has a template engine built into its standard library, using its own template language, which can be used for generating code:
package {{ .PackageName }}

func (o {{ .StructName }}) ShallowCopy() {{ .StructName }} {
	return {{ .StructName }}{
{{- range $field := .Fields }}
		{{ $field }}: o.{{ $field }},
{{- end }}
	}
}
Given the following struct:
package pkg

type MyStruct struct {
	Field1 int
	Field2 string
}
We can create an intermediate representation for our template:
data := map[string]interface{}{
	"PackageName": "pkg",
	"StructName":  "MyStruct",
	"Fields": []string{
		"Field1",
		"Field2",
	},
}
In our case, we can feed that directly into our template to generate our shallow copy function:
package pkg

func (o MyStruct) ShallowCopy() MyStruct {
	return MyStruct{
		Field1: o.Field1,
		Field2: o.Field2,
	}
}
See it in action: https://play.golang.org/p/hCRya6l61U8
Using templates has a major downside though: you basically limit yourself to the tools provided by the template engine, which can be hard or tedious to use for code generation purposes. The above example doesn't use conditionals or loops, for instance. Although Go's template language supports them, templates can very quickly become complex and hard to maintain. So while templates are usually more readable and easy to use for simpler cases, they can quickly get tricky to maintain. Another approach that doesn't suffer from these issues is writing the code to a buffer in plain Go:
b := &bytes.Buffer{}

fmt.Fprintf(b, "package %s\n\n", data.PackageName)

fmt.Fprintf(b, "func (o %[1]s) ShallowCopy() %[1]s {\n", data.StructName)
fmt.Fprintf(b, "\treturn %s{\n", data.StructName)

for _, field := range data.Fields {
	fmt.Fprintf(b, "\t\t%[1]s: o.%[1]s,\n", field)
}

fmt.Fprint(b, "\t}\n")
fmt.Fprint(b, "}\n")
See it in action: https://play.golang.org/p/UoiVSDbw88b
Comparing this solution to the template reveals a serious readability issue, so for simple use cases it's probably better to use templating. As mentioned before, though, templating isn't always the best solution either, so let's try to address the problems of this approach. One major problem that hurts readability is the API itself: we write lines, but the actual code begins somewhere in the middle of every write operation. The other problem is indentation: you have to prefix every line with the correct number of tab characters (technically, formatting could be a separate step). One common solution to these issues is a custom writer that exposes a fluent API:
b := &bytes.Buffer{}

w := &genWriter{
	w: b,
}

w.Wf("package %s\n", data.PackageName)

w.Wf("func (o %[1]s) ShallowCopy() %[1]s {", data.StructName)
{
	w := w.Indent()

	w.Wf("return %s{", data.StructName)
	{
		w := w.Indent()

		for _, field := range data.Fields {
			w.Wf("%[1]s: o.%[1]s,", field)
		}
	}
	w.W("}")
}
w.W("}")
See it in action: https://play.golang.org/p/dS45fgUj2lJ
This is just a very simple (and obviously not the best possible) wrapper around a buffer, but it already improves readability significantly. The Go implementation of Protobuf is a great example of this approach, and under the hood Kubebuilder uses a similar solution as well.

Writing to a buffer (even through some sort of wrapper) is probably the most common solution these days, but it still has its own issues. It's still not perfect from a readability perspective, it forces you to write code linearly, reusing components is hard (e.g. you have to take care of indentation), etc. Some libraries take this to the next level and provide a complete (fluent) API for code generation. One such library is jennifer. (Fun fact: code generation libraries are often called Jen, Jenny or Jennifer.) Generating the above code with jennifer looks like this:
f := jen.NewFile(data.PackageName)

f.Func().
	Params(jen.Id("o").Id(data.StructName)).
	Id("ShallowCopy").
	Params().
	Params(jen.Id(data.StructName)).
	Block(jen.Return(
		jen.Id(data.StructName).Values(jen.DictFunc(func(d jen.Dict) {
			for _, field := range data.Fields {
				d[jen.Id(field)] = jen.Id("o").Dot(field)
			}
		})),
	))
See it in action: https://play.golang.org/p/Vg8RxDMX6xm
This approach is not necessarily more readable, but it definitely provides a more structured and more reusable solution. It also gives you more freedom in how your code gets built (you are not bound to the linear nature of the buffered writer solution). The library is quite well-documented, with lots of examples. Give it a try!
To sum up: there are multiple solutions and tools for producing (Go) code in Go, each with its own issues and limitations. Choose the one that fits your use case best and/or is easiest for you to use.
So far we have focused on how code is generated from a custom input definition that looked something like this:
type inputData struct {
	PackageName string
	StructName  string
	Fields      []string
}

// ...

data := inputData{
	PackageName: "pkg",
	StructName:  "MyStruct",
	Fields: []string{
		"Field1",
		"Field2",
	},
}
As I mentioned earlier, the input for code generation can be basically anything. Common inputs are IDLs (Interface Definition Languages), but source code itself can serve as input for code generation as well. In our example, we want to generate a shallow copy function for a simple struct:
type MyStruct struct {
	Field1 int
	Field2 string
}
For that purpose, we need to parse the Go source code and transform it into the intermediate representation shown above. Fortunately, Go provides enough tooling for that: the standard library's go/parser and go/ast packages can parse source code into an AST, the go/types package allows further examination of the code, and golang.org/x/tools/go/packages helps load code from packages/modules. While these tools are sufficient to implement a parser for our code generator, presenting them would fill an entire blog post of its own. Instead, I will show you a higher-level framework that uses these components under the hood to orchestrate the entire generation process.
Kubebuilder is the latest SDK for building so-called Operators for Kubernetes. Operators are basically resource orchestrators: you tell the operator the desired state of a resource, and the operator hammers the target systems until the resource looks exactly like it should. From our perspective, the desired state is the important bit here. The desired state is described through special, custom Kubernetes manifests (Custom Resources, or CRs). Custom resources must follow a strict specification provided by Custom Resource Definitions (CRDs). In an object-oriented analogy: CRDs are classes, CRs are objects. Operators watch CRs for changes in the desired state and apply those changes to their managed resources.

Kubebuilder provides various tools for creating Operators and CRDs, keeping the necessary coding to a minimum. It does that by providing various code generators that produce boilerplate code for CRDs, CRs and Operators. For example, the code generation input for CRDs is a plain old struct; the same struct is used by the operator to do work on the CR. The code generator uses the struct to generate an OpenAPI schema, CRD resources, validation and deep copy functions, etc.

Let's think about this for a minute. Suppose we have the following struct:
type MyStruct struct {
	Field1 int
	Field2 string
}
There is very limited information available in it: we know the names of the fields and we know their types. How are we supposed to derive validation rules from this information alone? We obviously need a way to attach custom information to the code. Go doesn't have an annotation system, so Kubebuilder uses markers instead. Markers are special comments that attach metadata to packages, named types and struct fields:
type MyStruct struct {
	// +my:validation:Min=2
	Field1 int

	Field2 string
}
The attached metadata can be used to decide whether a type needs code generation at all, and to parameterize the generated code (for example, the minimum value of a validation rule). Markers are registered in a global registry and are processed together with the type information during code generation.

The code generation framework itself can be broken down into three components: inputs, generators and outputs.
Inputs are the parameters of the whole process or of individual generators (a single code generation run can invoke multiple generators), for example the package paths to be loaded.

Generators are implementations of the sigs.k8s.io/controller-tools/pkg/genall.Generator interface. A generator instance is responsible for two things: registering its markers into a registry, and generating the actual code. A very basic implementation of a generator looks like this:
type Generator struct{}

func (Generator) RegisterMarkers(into *markers.Registry) error {
	return nil
}

func (Generator) Generate(ctx *genall.GenerationContext) error {
	// loop through the loaded packages
	for _, root := range ctx.Roots {
		root.NeedTypesInfo()

		// loop through the types in the package
		if err := markers.EachType(ctx.Collector, root, func(info *markers.TypeInfo) {
			// check if the type needs code generation
			// and extract the necessary information from the type
		}); err != nil {
			root.AddError(err)

			return nil
		}

		// generate all the code here for a single package
		// (if there is anything to generate)

		// invoke the output here
	}

	return nil
}
Outputs are responsible for writing the generated code to files. There are basic output implementations in the library (stdout, directory, etc.), but you can implement your own outputs to control where the generated code gets written.

These components (inputs, generators, outputs) can be used on their own, but the library also comes with a facade that takes them as input and coordinates the code generation process.
Unfortunately, going through the entire code line by line is simply not possible within the scope of this post. The core concepts and some sample code are explained above; you can find the source code for the complete example here.
Generally, code generation is very useful for generating boilerplate code and for automatically creating different representations of the same information (e.g. OpenAPI descriptor -> request/response objects and client code). Generating default implementations and mocks is also very common, and generating interfaces is not without precedent either. At Banzai Cloud we have almost all of these use cases in our projects, and we use various code generation tools.

Our latest code generation practice is closely related to how we organize our applications: we follow the elegant monolith concept in most of our projects. To keep the operational overhead of our components at a minimum, we ship them in a low number of artifacts (binaries/images), but in the code we keep the different services of a single application strictly separated. We also separate the business logic from the transport layers:
Integrating business logic into transport layers is usually a boring job. That's why we generate most of that code, so we can easily plug business services into various transports (using Go kit). We also generate certain event-driven handlers and mocks for interfaces, using the techniques explained above. Check out the tool we use for code generation for more examples.
The framework provided by Kubebuilder (controller-tools) offers an easy way to annotate and parse code for code generation purposes, while text/template and jennifer help with the actual code generation. Regardless of the use case, this combination can be used to quickly create code generators for various purposes.