1 /*******************************************************************************
2 
3     Utilities to fill a struct representing the configuration with the content
4     of a YAML document.
5 
6     The main function of this module is `parseConfig`. Convenience functions
7     `parseConfigString` and `parseConfigFile` are also available.
8 
9     The type parameter to those three functions must be a struct and is used
10     to drive the processing of the YAML node. When an error is encountered,
11     an `Exception` will be thrown, with a descriptive message.
12     The rules by which the struct is filled are designed to be
13     as intuitive as possible, and are described below.
14 
15     Optional_Fields:
      One of the major conveniences offered by this utility is its handling
      of optional fields. A field is detected as optional if it has
      an initializer that is different from its type's `init` value;
      for example, `string field = "Something";` is an optional field,
      but `int count = 0;` is not.
21       To mark a field as optional even with its default value,
22       use the `Optional` UDA: `@Optional int count = 0;`.
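      For example, in the following definition only `name` is required
      (a sketch; the struct and field names are illustrative):
      ---
      public struct Config
      {
          /// Required: `string.init` (null) is the type's default value
          public string name;
          /// Optional: the initializer differs from `string.init`
          public string greeting = "Hello";
          /// Optional: explicitly marked, since `0` is `int.init`
          public @Optional int count = 0;
      }
      ---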
23 
24     fromYAML:
      Because config structs may contain complex types outside of the project's
      control (e.g. a Phobos type, Vibe.d's `URL`, etc.), or because one may want
      the config format to be more dynamic (e.g. by exposing union-like behavior),
      it is sometimes necessary to apply more custom logic than Configy provides.
29       For this use case, one can define a `fromYAML` static method in the type:
30       `static S fromYAML(scope ConfigParser!S parser)`, where `S` is the type of
31       the enclosing structure. Structs with `fromYAML` will have this method
32       called instead of going through the normal parsing rules.
33       The `ConfigParser` exposes the current path of the field, as well as the
34       raw YAML `Node` itself, allowing for maximum flexibility.
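      For example, a type could accept either a scalar or a mapping form
      (a sketch; the `Endpoint` type and its fields are illustrative):
      ---
      public struct Endpoint
      {
          public string host;
          public ushort port = 80;

          public static Endpoint fromYAML (scope ConfigParser!Endpoint parser)
          {
              // `parser.node` is the raw D-YAML node, `parser.path` the config path
              if (parser.node.nodeID == NodeID.scalar)
                  return Endpoint(parser.node.as!string);
              return Endpoint(parser.node["host"].as!string,
                              parser.node["port"].as!ushort);
          }
      }
      ---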
35 
36     Composite_Types:
      Processing starts from a `struct` at the top level and recurses into
      every field individually. If a field is itself a struct,
      the filler will attempt the following, in order:
40       - If the field has no value and is not optional, an Exception will
41         be thrown with an error message detailing where the issue happened.
42       - If the field has no value and is optional, the default value will
43         be used.
44       - If the field has a value, the filler will first check for a converter
45         and use it if present.
46       - If the type has a `static` method named `fromString` whose sole argument
47         is a `string`, it will be used.
48       - If the type has a constructor whose sole argument is a `string`,
        it will be used.
50       - Finally, the filler will attempt to deserialize all struct members
51         one by one and pass them to the default constructor, if there is any.
52       - If none of the above succeeded, a `static assert` will trigger.
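      For example, the `fromString` and string-constructor rules could be
      satisfied by types such as (a sketch; both types are illustrative):
      ---
      public struct Hash
      {
          public ubyte[] data;

          /// Matches the "constructor whose sole argument is a `string`" rule
          public this (string hex) @safe pure
          {
              // decode `hex` into `data` (omitted)
          }
      }

      public struct Percentage
      {
          public double value;

          /// Checked before the constructor rule
          public static Percentage fromString (string s)
          {
              import std.conv : to;
              return Percentage(s.to!double / 100.0);
          }
      }
      ---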
53 
54     Alias_this:
      If a `struct` contains an `alias this`, the aliased field itself will be
      ignored; instead, the config parser will parse its nested fields as if they
      were part of the enclosing structure. This allows re-using a single `struct`
      in multiple places without having to resort to a `mixin template`.
      If the aliased field has an initializer, all of the fields it exposes
      become optional.
60       The aliased field cannot have attributes other than `@Optional`,
61       which will then apply to all fields it exposes.
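      For example (a sketch; the types are illustrative):
      ---
      public struct BindAddress
      {
          public string address;
          public ushort port;
      }

      public struct ListenerConfig
      {
          public BindAddress bind;
          public alias bind this;
      }
      ---
      With this definition, the YAML for a `ListenerConfig` section provides
      `address` and `port` directly, without a nested `bind` key.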
62 
63     Duration_parsing:
64       If the config field is of type `core.time.Duration`, special parsing rules
65       will apply. There are two possible forms in which a Duration field may
66       be expressed. In the first form, the YAML node should be a mapping,
67       and it will be checked for fields matching the supported units
68       in `core.time`: `weeks`, `days`, `hours`, `minutes`, `seconds`, `msecs`,
      `usecs`, `hnsecs`, `nsecs`. The strict parsing option will be respected.
70       The values of the fields will then be added together, so the following
71       YAML usages are equivalent:
72       ---
73       // sleepFor:
74       //   hours: 8
75       //   minutes: 30
76       ---
77       and:
78       ---
79       // sleepFor:
80       //   minutes: 510
81       ---
82       Provided that the definition of the field is:
83       ---
84       public Duration sleepFor;
85       ---
86 
87       In the second form, the field should have a suffix composed of an
88       underscore ('_'), followed by a unit name as defined in `core.time`.
89       This can be either the field name directly, or a name override.
90       The latter is recommended to avoid confusion when using the field in code.
91       In this form, the YAML node is expected to be a scalar.
92       So the previous example, using this form, would be expressed as:
93       ---
94       sleepFor_minutes: 510
95       ---
96       and the field definition should be one of those two:
97       ---
98       public @Name("sleepFor_minutes") Duration sleepFor; /// Prefer this
99       public Duration sleepFor_minutes; /// This works too
100       ---
101 
102       Those forms are mutually exclusive, so a field with a unit suffix
103       will error out if a mapping is used. This prevents surprises and ensures
104       that the error message, if any, is consistent across user input.
105 
106       To disable or change this behavior, one may use a `Converter` instead.
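      For example, a field-level converter could accept a plain number of
      seconds instead. This is only a sketch, assuming the `Converter` UDA
      wraps a function receiving the field's `ConfigParser` (the way this
      module invokes converters):
      ---
      @Converter!Duration((scope ConfigParser!Duration p) =>
          core.time.seconds(p.node.as!long))
      public Duration timeout;
      ---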
107 
108     Strict_Parsing:
      When strict parsing is enabled, the config filler will also validate
      that the YAML document does not contain entries which are not present
      in the mapping (struct) being processed.
112       This can be useful to catch typos or outdated configuration options.
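      For example, given `struct Config { int timeout = 30; }`, the following
      document is accepted under `StrictMode.Ignore` but rejected under
      `StrictMode.Error`, pointing at the unknown `timeot` key:
      ---
      timeot: 5
      ---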
113 
114     Post_Validation:
      Some configurations require validation across multiple sections.
116       For example, two sections may be mutually exclusive as a whole,
117       or may have fields which are mutually exclusive with another section's
118       field(s). This kind of dependence is hard to account for declaratively,
119       and does not affect parsing. For this reason, the preferred way to
120       handle those cases is to define a `validate` member method on the
121       affected config struct(s), which will be called once
122       parsing for that mapping is completed.
123       If an error is detected, this method should throw an Exception.
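      For example (a sketch; the fields are illustrative):
      ---
      public struct Config
      {
          public @Optional string socket_path;
          public @Optional ushort port;

          public void validate () const
          {
              if (this.socket_path.length && this.port != 0)
                  throw new Exception(
                      "Only one of 'socket_path' and 'port' may be set");
          }
      }
      ---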
124 
125     Enabled_or_disabled_field:
      While most complex validation logic should be handled post-parsing,
      some sections may be optional by default but have required fields
      when provided. To support this use case, if a field with the name
      `enabled` is present in a struct, the parser will process it first.
      If it is `false`, the parser will not attempt to process the struct
      further, and the other fields will keep their default values.
132       Likewise, if a field named `disabled` exists, the struct will not
133       be processed if it is set to `true`.
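      For example (a sketch; the `Telemetry` section is illustrative):
      ---
      public struct Telemetry
      {
          /// When `false` (or when the section is absent), parsing stops here
          public bool enabled;
          /// Required, but only when `enabled: true` is set
          public string endpoint;
      }
      ---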
134 
135     Copyright:
136         Copyright (c) 2019-2022 BOSAGORA Foundation
137         All rights reserved.
138 
139     License:
140         MIT License. See LICENSE for details.
141 
142 *******************************************************************************/
143 
144 module dub.internal.configy.Read;
145 
146 public import dub.internal.configy.Attributes;
147 public import dub.internal.configy.Exceptions : ConfigException;
148 import dub.internal.configy.Exceptions;
149 import dub.internal.configy.FieldRef;
150 import dub.internal.configy.Utils;
151 
152 import dub.internal.dyaml.exception;
153 import dub.internal.dyaml.node;
154 import dub.internal.dyaml.loader;
155 
156 import std.algorithm;
157 import std.conv;
158 import std.datetime;
159 import std.format;
160 import std.getopt;
161 import std.meta;
162 import std.range;
163 import std.traits;
164 import std.typecons : Nullable, nullable, tuple;
165 
166 static import core.time;
167 
168 // Dub-specific adjustments for output
169 import dub.internal.logging;
170 
171 /// Command-line arguments
172 public struct CLIArgs
173 {
174     /// Path to the config file
175     public string config_path = "config.yaml";
176 
177     /// Overrides for config options
178     public string[][string] overrides;
179 
180     /// Helper to add items to `overrides`
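    /// For example, passing `-O foo.bar=true` on the command line invokes this
    /// handler with `value == "foo.bar=true"`, resulting in
    /// `overrides["foo.bar"] == ["true"]`.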
181     public void overridesHandler (string, string value)
182     {
183         import std.string;
184         const idx = value.indexOf('=');
185         if (idx < 0) return;
186         string k = value[0 .. idx], v = value[idx + 1 .. $];
187         if (auto val = k in this.overrides)
188             (*val) ~= v;
189         else
190             this.overrides[k] = [ v ];
191     }
192 
193     /***************************************************************************
194 
195         Parses the base command line arguments
196 
        This can be composed with program-specific arguments.
        For example, for a program which wants to expose a `--version`
        switch, the definition could look like this:
200         ---
201         public struct ProgramCLIArgs
202         {
203             public CLIArgs base; // This struct
204 
205             public alias base this; // For convenience
206 
207             public bool version_; // Program-specific part
208         }
209         ---
210         Then, an application-specific configuration routine would be:
211         ---
212         public GetoptResult parse (ref ProgramCLIArgs clargs, ref string[] args)
213         {
214             auto r = clargs.base.parse(args);
215             if (r.helpWanted) return r;
216             return getopt(
217                 args,
                "version", "Print the application version", &clargs.version_);
219         }
220         ---
221 
222         Params:
223           args = The command line args to parse (parsed options will be removed)
224           passThrough = Whether to enable `config.passThrough` and
225                         `config.keepEndOfOptions`. `true` by default, to allow
226                         composability. If your program doesn't have other
227                         arguments, pass `false`.
228 
229         Returns:
230           The result of calling `getopt`
231 
232     ***************************************************************************/
233 
234     public GetoptResult parse (ref string[] args, bool passThrough = true)
235     {
236         return getopt(
237             args,
238             // `caseInsensitive` is the default, but we need something
239             // with the same type for the ternary
240             passThrough ? config.keepEndOfOptions : config.caseInsensitive,
241             // Also the default, same reasoning
242             passThrough ? config.passThrough : config.noPassThrough,
243             "config|c",
244                 "Path to the config file. Defaults to: " ~ this.config_path,
245                 &this.config_path,
246 
247             "override|O",
248                 "Override a config file value\n" ~
                "Example: -O foo.bar=true -O dns=1.1.1.1 -O dns=2.2.2.2\n" ~
                "Array values are additive; other items are set to the last override",
251                 &this.overridesHandler,
252         );
253     }
254 }
255 
256 /*******************************************************************************
257 
258     Attempt to read and process the config file at `path`, print any error
259 
260     This 'simple' overload of the more detailed `parseConfigFile` will attempt
261     to read the file at `path`, and return a `Nullable` instance of it.
262     If an error happens, either because the file isn't readable or
263     the configuration has an issue, a message will be printed to `stderr`,
264     with colors if the output is a TTY, and a `null` instance will be returned.
265 
266     The calling code can hence just read a config file via:
267     ```
268     int main ()
269     {
270         auto configN = parseConfigFileSimple!Config("config.yaml");
271         if (configN.isNull()) return 1; // Error path
272         auto config = configN.get();
273         // Rest of the program ...
274     }
275     ```
276     An overload accepting `CLIArgs args` also exists.
277 
278     Params:
279         path = Path of the file to read from
280         args = Command line arguments on which `parse` has been called
281         strict = Whether the parsing should reject unknown keys in the
282                  document, warn, or ignore them (default: `StrictMode.Error`)
283 
284     Returns:
285         An initialized `Config` instance if reading/parsing was successful;
286         a `null` instance otherwise.
287 
288 *******************************************************************************/
289 
290 public Nullable!T parseConfigFileSimple (T) (string path, StrictMode strict = StrictMode.Error)
291 {
292     return parseConfigFileSimple!(T)(CLIArgs(path), strict);
293 }
294 
295 
296 /// Ditto
297 public Nullable!T parseConfigFileSimple (T) (in CLIArgs args, StrictMode strict = StrictMode.Error)
298 {
299     try
300     {
301         Node root = Loader.fromFile(args.config_path).load();
302         return nullable(parseConfig!T(args, root, strict));
303     }
304     catch (ConfigException exc)
305     {
306         exc.printException();
307         return typeof(return).init;
308     }
309     catch (Exception exc)
310     {
        // Other Exception types may be thrown by D-YAML;
        // they won't include rich information.
313         logWarn("%s", exc.message());
314         return typeof(return).init;
315     }
316 }
317 
318 /*******************************************************************************
319 
320     Print an Exception, potentially with colors on
321 
322     Trusted because of `stderr` usage.
323 
324 *******************************************************************************/
325 
326 private void printException (scope ConfigException exc) @trusted
327 {
328     import dub.internal.logging;
329 
330     if (hasColors)
331         logWarn("%S", exc);
332     else
333         logWarn("%s", exc.message());
334 }
335 
336 /*******************************************************************************
337 
338     Parses the config file or string and returns a `Config` instance.
339 
340     Params:
341         cmdln = command-line arguments (containing the path to the config)
342         path = When parsing a string, the path corresponding to it
343         strict = Whether the parsing should reject unknown keys in the
344                  document, warn, or ignore them (default: `StrictMode.Error`)
345 
346     Throws:
347         `Exception` if parsing the config file failed.
348 
349     Returns:
350         `Config` instance
351 
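    Example (a sketch using an illustrative `Config` struct):
    ---
    static struct Config { string name; int count = 42; }
    auto config = parseConfigString!Config("name: dub\n", "/dev/null");
    assert(config.name == "dub" && config.count == 42);
    ---
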
352 *******************************************************************************/
353 
354 public T parseConfigFile (T) (in CLIArgs cmdln, StrictMode strict = StrictMode.Error)
355 {
356     Node root = Loader.fromFile(cmdln.config_path).load();
357     return parseConfig!T(cmdln, root, strict);
358 }
359 
360 /// ditto
361 public T parseConfigString (T) (string data, string path, StrictMode strict = StrictMode.Error)
362 {
363     CLIArgs cmdln = { config_path: path };
364     auto loader = Loader.fromString(data);
365     loader.name = path;
366     Node root = loader.load();
367     return parseConfig!T(cmdln, root, strict);
368 }
369 
370 /*******************************************************************************
371 
372     Process the content of the YAML document described by `node` into an
373     instance of the struct `T`.
374 
375     See the module description for a complete overview of this function.
376 
377     Params:
378       T = Type of the config struct to fill
379       cmdln = Command line arguments
380       node = The root node matching `T`
381       strict = Action to take when encountering unknown keys in the document
382 
383     Returns:
384       An instance of `T` filled with the content of `node`
385 
386     Throws:
387       If the content of `node` cannot satisfy the requirements set by `T`,
      or if `node` contains extra fields and `strict` is `StrictMode.Error`.
389 
390 *******************************************************************************/
391 
392 public T parseConfig (T) (
393     in CLIArgs cmdln, Node node, StrictMode strict = StrictMode.Error)
394 {
395     static assert(is(T == struct), "`" ~ __FUNCTION__ ~
396                   "` should only be called with a `struct` type as argument, not: `" ~
397                   fullyQualifiedName!T ~ "`");
398 
399     final switch (node.nodeID)
400     {
401     case NodeID.mapping:
402             dbgWrite("Parsing config '%s', strict: %s",
403                      fullyQualifiedName!T,
404                      strict == StrictMode.Warn ?
405                        strict.paint(Yellow) : strict.paintIf(!!strict, Green, Red));
406             return node.parseField!(StructFieldRef!T)(
407                 null, T.init, const(Context)(cmdln, strict));
408     case NodeID.sequence:
409     case NodeID.scalar:
410     case NodeID.invalid:
411         throw new TypeConfigException(node, "mapping (object)", "document root");
412     }
413 }
414 
415 /*******************************************************************************
416 
    The behavior to adopt when encountering a field in the YAML document
    that is not present in the config definition.
419 
420 *******************************************************************************/
421 
422 public enum StrictMode
423 {
424     /// Issue an error by throwing an `UnknownKeyConfigException`
425     Error  = 0,
426     /// Write a message to `stderr`, but continue processing the file
427     Warn   = 1,
428     /// Be silent and do nothing
429     Ignore = 2,
430 }
431 
432 /// Used to pass around configuration
433 package struct Context
434 {
435     ///
436     private CLIArgs cmdln;
437 
438     ///
439     private StrictMode strict;
440 }
441 
442 /*******************************************************************************
443 
444     Parse a mapping from `node` into an instance of `T`
445 
446     Params:
447       TLFR = Top level field reference for this mapping
448       node = The YAML node object matching the struct being read
449       path = The runtime path to this mapping, used for nested types
450       defaultValue = The default value to use for `T`, which can be different
451                      from `T.init` when recursing into fields with initializers.
452       ctx = A context where properties that need to be conserved during
453             recursion are stored
454       fieldDefaults = Default value for some fields, used for `Key` recursion
455 
456 *******************************************************************************/
457 private TLFR.Type parseMapping (alias TLFR)
458     (Node node, string path, auto ref TLFR.Type defaultValue,
459      in Context ctx, in Node[string] fieldDefaults)
460 {
461     static assert(is(TLFR.Type == struct), "`parseMapping` called with wrong type (should be a `struct`)");
462     assert(node.nodeID == NodeID.mapping, "Internal error: parseMapping shouldn't have been called");
463 
464     dbgWrite("%s: `parseMapping` called for '%s' (node entries: %s)",
465              TLFR.Type.stringof.paint(Cyan), path.paint(Cyan),
466              node.length.paintIf(!!node.length, Green, Red));
467 
468     static foreach (FR; FieldRefTuple!(TLFR.Type))
469     {
470         static if (FR.Name != FR.FieldName && hasMember!(TLFR.Type, FR.Name) &&
471                    !is(typeof(mixin("TLFR.Type.", FR.Name)) == function))
472             static assert (FieldRef!(TLFR.Type, FR.Name).Name != FR.Name,
473                            "Field `" ~ FR.FieldName ~ "` `@Name` attribute shadows field `" ~
474                            FR.Name ~ "` in `" ~ TLFR.Type.stringof ~ "`: Add a `@Name` attribute to `" ~
475                            FR.Name ~ "` or change that of `" ~ FR.FieldName ~ "`");
476     }
477 
478     if (ctx.strict != StrictMode.Ignore)
479     {
        /// First, check that all the sections found in the mapping are present in the type.
        /// If not, the user might have made a typo.
482         immutable string[] fieldNames = [ FieldsName!(TLFR.Type) ];
483         immutable string[] patterns = [ Patterns!(TLFR.Type) ];
484     FIELD: foreach (const ref Node key, const ref Node value; node)
485         {
486             const k = key.as!string;
487             if (!fieldNames.canFind(k))
488             {
489                 foreach (p; patterns)
490                     if (k.startsWith(p))
491                         // Require length because `0` would match `canFind`
492                         // and we don't want to allow `$PATTERN-`
493                         if (k[p.length .. $].length > 1 && k[p.length] == '-')
494                             continue FIELD;
495 
496                 if (ctx.strict == StrictMode.Warn)
497                 {
498                     scope exc = new UnknownKeyConfigException(
499                         path, key.as!string, fieldNames, key.startMark());
500                     exc.printException();
501                 }
502                 else
503                     throw new UnknownKeyConfigException(
504                         path, key.as!string, fieldNames, key.startMark());
505             }
506         }
507     }
508 
509     const enabledState = node.isMappingEnabled!(TLFR.Type)(defaultValue);
510 
511     if (enabledState.field != EnabledState.Field.None)
512         dbgWrite("%s: Mapping is enabled: %s", TLFR.Type.stringof.paint(Cyan), (!!enabledState).paintBool());
513 
514     auto convertField (alias FR) ()
515     {
516         static if (FR.Name != FR.FieldName)
517             dbgWrite("Field name `%s` will use YAML field `%s`",
518                      FR.FieldName.paint(Yellow), FR.Name.paint(Green));
519         // Using exact type here matters: we could get a qualified type
520         // (e.g. `immutable(string)`) if the field is qualified,
521         // which causes problems.
522         FR.Type default_ = __traits(getMember, defaultValue, FR.FieldName);
523 
524         // If this struct is disabled, do not attempt to parse anything besides
525         // the `enabled` / `disabled` field.
526         if (!enabledState)
527         {
528             // Even this is too noisy
529             version (none)
530                 dbgWrite("%s: %s field of disabled struct, default: %s",
531                          path.paint(Cyan), "Ignoring".paint(Yellow), default_);
532 
533             static if (FR.Name == "enabled")
534                 return false;
535             else static if (FR.Name == "disabled")
536                 return true;
537             else
538                 return default_;
539         }
540 
541         if (auto ptr = FR.FieldName in fieldDefaults)
542         {
543             dbgWrite("Found %s (%s.%s) in `fieldDefaults`",
544                      FR.Name.paint(Cyan), path.paint(Cyan), FR.FieldName.paint(Cyan));
545 
546             if (ctx.strict && FR.FieldName in node)
547                 throw new ConfigExceptionImpl("'Key' field is specified twice", path, FR.FieldName, node.startMark());
548             return (*ptr).parseField!(FR)(path.addPath(FR.FieldName), default_, ctx)
549                 .dbgWriteRet("Using value '%s' from fieldDefaults for field '%s'",
550                              FR.FieldName.paint(Cyan));
551         }
552 
553         // This, `FR.Pattern`, and the field in `@Name` are special support for `dub`
554         static if (FR.Pattern)
555         {
556             static if (is(FR.Type : V[K], K, V))
557             {
558                 alias AAFieldRef = NestedFieldRef!(V, FR);
559                 static assert(is(K : string), "Key type should be string-like");
560             }
561             else
562                 static assert(0, "Cannot have pattern on non-AA field");
563 
564             AAFieldRef.Type[string] result;
565             foreach (pair; node.mapping)
566             {
567                 const key = pair.key.as!string;
568                 if (!key.startsWith(FR.Name))
569                     continue;
570                 string suffix = key[FR.Name.length .. $];
571                 if (suffix.length)
572                 {
573                     if (suffix[0] == '-') suffix = suffix[1 .. $];
574                     else continue;
575                 }
576 
577                 result[suffix] = pair.value.parseField!(AAFieldRef)(
578                     path.addPath(key), default_.get(key, AAFieldRef.Type.init), ctx);
579             }
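            // A runtime-conditional return keeps the code below reachable as far
            // as the compiler is concerned, avoiding "statement is not reachable"
            // diagnostics in this `static if` branch.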
580             bool hack = true;
581             if (hack) return result;
582         }
583 
584         if (auto ptr = FR.Name in node)
585         {
586             dbgWrite("%s: YAML field is %s in node%s",
587                      FR.Name.paint(Cyan), "present".paint(Green),
                     (FR.Name == FR.FieldName ? "" : " (note that field name is overridden)").paint(Yellow));
589             return (*ptr).parseField!(FR)(path.addPath(FR.Name), default_, ctx)
590                 .dbgWriteRet("Using value '%s' from YAML document for field '%s'",
591                              FR.FieldName.paint(Cyan));
592         }
593 
594         dbgWrite("%s: Field is %s from node%s",
595                  FR.Name.paint(Cyan), "missing".paint(Red),
                 (FR.Name == FR.FieldName ? "" : " (note that field name is overridden)").paint(Yellow));
597 
        // A field is considered optional if it has an initializer that is
        // different from its type's `init` value, or if it has the `Optional` UDA.
        // In that case, just return this value.
601         static if (FR.Optional)
602             return default_
603                 .dbgWriteRet("Using default value '%s' for optional field '%s'", FR.FieldName.paint(Cyan));
604 
605         // The field is not present, but it could be because it is an optional section.
606         // For example, the section could be defined as:
607         // ---
608         // struct RequestLimit { size_t reqs = 100; }
609         // struct Config { RequestLimit limits; }
610         // ---
611         // In this case we need to recurse into `RequestLimit` to check if any
        // of its fields is required.
613         else static if (mightBeOptional!FR)
614         {
615             const npath = path.addPath(FR.Name);
616             string[string] aa;
617             return Node(aa).parseMapping!(FR)(npath, default_, ctx, null);
618         }
619         else
620             throw new MissingKeyException(path, FR.Name, node.startMark());
621     }
622 
623     FR.Type convert (alias FR) ()
624     {
625         static if (__traits(getAliasThis, TLFR.Type).length == 1 &&
626                    __traits(getAliasThis, TLFR.Type)[0] == FR.FieldName)
627         {
628             static assert(FR.Name == FR.FieldName,
629                           "Field `" ~ fullyQualifiedName!(FR.Ref) ~
630                           "` is the target of an `alias this` and cannot have a `@Name` attribute");
631             static assert(!hasConverter!(FR.Ref),
632                           "Field `" ~ fullyQualifiedName!(FR.Ref) ~
633                           "` is the target of an `alias this` and cannot have a `@Converter` attribute");
634 
635             alias convertW(string FieldName) = convert!(FieldRef!(FR.Type, FieldName, FR.Optional));
636             return FR.Type(staticMap!(convertW, FieldNameTuple!(FR.Type)));
637         }
638         else
639             return convertField!(FR)();
640     }
641 
642     debug (ConfigFillerDebug)
643     {
644         indent++;
645         scope (exit) indent--;
646     }
647 
648     TLFR.Type doValidation (TLFR.Type result)
649     {
650         static if (is(typeof(result.validate())))
651         {
652             if (enabledState)
653             {
654                 dbgWrite("%s: Calling `%s` method",
655                      TLFR.Type.stringof.paint(Cyan), "validate()".paint(Green));
656                 result.validate();
657             }
658             else
659             {
660                 dbgWrite("%s: Ignoring `%s` method on disabled mapping",
661                          TLFR.Type.stringof.paint(Cyan), "validate()".paint(Green));
662             }
663         }
664         else if (enabledState)
665             dbgWrite("%s: No `%s` method found",
666                      TLFR.Type.stringof.paint(Cyan), "validate()".paint(Yellow));
667 
668         return result;
669     }
670 
671     // This might trigger things like "`this` is not accessible".
672     // In this case, the user most likely needs to provide a converter.
673     alias convertWrapper(string FieldName) = convert!(FieldRef!(TLFR.Type, FieldName));
674     return doValidation(TLFR.Type(staticMap!(convertWrapper, FieldNameTuple!(TLFR.Type))));
675 }
676 
677 /*******************************************************************************
678 
679     Parse a field, trying to match up the compile-time expectation with
680     the run time value of the Node (`nodeID`).
681 
682     This is the central point which does "type conversion", from the YAML node
683     to the field type. Whenever adding support for a new type, things should
    to the field type. Whenever adding support for a new type, the logic
    should be added here.
686     Because a `struct` can be filled from either a mapping or a scalar,
687     this function will first try the converter / fromString / string ctor
688     methods before defaulting to field-wise construction.
689 
    Note that optional fields are checked before recursion happens,
    so this method does not perform that check.
692 
693 *******************************************************************************/
694 
695 package FR.Type parseField (alias FR)
696     (Node node, string path, auto ref FR.Type defaultValue, in Context ctx)
697 {
698     if (node.nodeID == NodeID.invalid)
699         throw new TypeConfigException(node, "valid", path);
700 
701     // If we reached this, it means the field is set, so just recurse
702     // to peel the type
703     static if (is(FR.Type : SetInfo!FT, FT))
704         return FR.Type(
705             parseField!(FieldRef!(FR.Type, "value"))(node, path, defaultValue, ctx),
706             true);
707 
708     else static if (hasConverter!(FR.Ref))
709         return wrapException(node.viaConverter!(FR)(path, ctx), path, node.startMark());
710 
711     else static if (hasFromYAML!(FR.Type))
712     {
713         scope impl = new ConfigParserImpl!(FR.Type)(node, path, ctx);
714         return wrapException(FR.Type.fromYAML(impl), path, node.startMark());
715     }
716 
717     else static if (hasFromString!(FR.Type))
718         return wrapException(FR.Type.fromString(node.as!string), path, node.startMark());
719 
720     else static if (hasStringCtor!(FR.Type))
721         return wrapException(FR.Type(node.as!string), path, node.startMark());
722 
723     else static if (is(immutable(FR.Type) == immutable(core.time.Duration)))
724     {
725         if (node.nodeID != NodeID.mapping)
726             throw new DurationTypeConfigException(node, path);
727         return node.parseMapping!(StructFieldRef!DurationMapping)(
728             path, DurationMapping.make(defaultValue), ctx, null).opCast!Duration;
729     }
730 
731     else static if (is(FR.Type == struct))
732     {
733         if (node.nodeID != NodeID.mapping)
734             throw new TypeConfigException(node, "mapping (object)", path);
735         return node.parseMapping!(FR)(path, defaultValue, ctx, null);
736     }
737 
    // Handle strings early as they match the sequence rule too
739     else static if (isSomeString!(FR.Type))
740         // Use `string` type explicitly because `Variant` thinks
741         // `immutable(char)[]` (aka `string`) and `immutable(char[])`
742         // (aka `immutable(string)`) are not compatible.
743         return node.parseScalar!(string)(path);
    // Enums too, as their base type might be an array (including strings)
745     else static if (is(FR.Type == enum))
746         return node.parseScalar!(FR.Type)(path);
747 
748     else static if (is(FR.Type : E[K], E, K))
749     {
750         if (node.nodeID != NodeID.mapping)
751             throw new TypeConfigException(node, "mapping (associative array)", path);
752 
753         // Note: As of June 2022 (DMD v2.100.0), associative arrays cannot
754         // have initializers, hence their UX for config is less optimal.
755         return node.mapping().map!(
756                 (Node.Pair pair) {
757                     return tuple(
758                         pair.key.get!K,
759                         pair.value.parseField!(NestedFieldRef!(E, FR))(
760                             format("%s[%s]", path, pair.key.as!string), E.init, ctx));
761                 }).assocArray();
762 
763     }
764     else static if (is(FR.Type : E[], E))
765     {
766         static if (hasUDA!(FR.Ref, Key))
767         {
768             static assert(getUDAs!(FR.Ref, Key).length == 1,
769                           "`" ~ fullyQualifiedName!(FR.Ref) ~
770                           "` field shouldn't have more than one `Key` attribute");
771             static assert(is(E == struct),
772                           "Field `" ~ fullyQualifiedName!(FR.Ref) ~
773                           "` has a `Key` attribute, but is a sequence of `" ~
774                           fullyQualifiedName!E ~ "`, not a sequence of `struct`");
775 
776             string key = getUDAs!(FR.Ref, Key)[0].name;
777 
778             if (node.nodeID != NodeID.mapping && node.nodeID != NodeID.sequence)
779                 throw new TypeConfigException(node, "mapping (object) or sequence", path);
780 
781             if (node.nodeID == NodeID.mapping) return node.mapping().map!(
782                 (Node.Pair pair) {
783                     if (pair.value.nodeID != NodeID.mapping)
784                         throw new TypeConfigException(
785                             "sequence of " ~ pair.value.nodeTypeString(),
786                             "sequence of mapping (array of objects)",
787                             path, null, node.startMark());
788 
789                     return pair.value.parseMapping!(StructFieldRef!E)(
790                         path.addPath(pair.key.as!string),
791                         E.init, ctx, key.length ? [ key: pair.key ] : null);
792                 }).array();
793         }
794         if (node.nodeID != NodeID.sequence)
795             throw new TypeConfigException(node, "sequence (array)", path);
796 
797         typeof(return) validateLength (E[] res)
798         {
799             static if (is(FR.Type : E_[k], E_, size_t k))
800             {
801                 if (res.length != k)
802                     throw new ArrayLengthException(
803                         res.length, k, path, null, node.startMark());
804                 return res[0 .. k];
805             }
806             else
807                 return res;
808         }
809 
810         // We pass `E.init` as default value as it is not going to be used:
811         // Either there is something in the YAML document, and that will be
812         // converted, or `sequence` will not iterate.
813         return validateLength(
814             node.sequence.enumerate.map!(
815             kv => kv.value.parseField!(NestedFieldRef!(E, FR))(
816                 format("%s[%s]", path, kv.index), E.init, ctx))
817             .array()
818         );
819     }
820     else
821     {
822         static assert (!is(FR.Type == union),
823                        "`union` are not supported. Use a converter instead");
824         return node.parseScalar!(FR.Type)(path);
825     }
826 }
827 
828 /// Parse a node as a scalar
829 private T parseScalar (T) (Node node, string path)
830 {
831     if (node.nodeID != NodeID.scalar)
832         throw new TypeConfigException(node, "scalar (value)", path);
833 
834     static if (is(T == enum))
835         return node.as!string.to!(T);
836     else
837         return node.as!(T);
838 }
839 
840 /*******************************************************************************
841 
    Wrap a potentially-throwing user-provided expression in a `ConfigException`

    The user-provided hooks may throw (e.g. `fromString` or the constructor),
    and the error may or may not be clear. We can't do anything about a bad
    message, but we can wrap the thrown exception in a `ConfigException`
    to provide the location in the YAML file where the error happened.
848 
849     Params:
850       exp = The expression that may throw
851       path = Path within the config file of the field
852       position = Position of the node in the YAML file
853       file = Call site file (otherwise the message would point to this function)
854       line = Call site line (see `file` reasoning)
855 
856     Returns:
857       The result of `exp` evaluation.
858 
859 *******************************************************************************/
860 
861 private T wrapException (T) (lazy T exp, string path, Mark position,
862     string file = __FILE__, size_t line = __LINE__)
863 {
864     try
865         return exp;
866     catch (ConfigException exc)
867         throw exc;
868     catch (Exception exc)
869         throw new ConstructionException(exc, path, position, file, line);
870 }
871 
872 /// Allows us to reuse parseMapping and strict parsing
873 private struct DurationMapping
874 {
875     public SetInfo!long weeks;
876     public SetInfo!long days;
877     public SetInfo!long hours;
878     public SetInfo!long minutes;
879     public SetInfo!long seconds;
880     public SetInfo!long msecs;
881     public SetInfo!long usecs;
882     public SetInfo!long hnsecs;
883     public SetInfo!long nsecs;
884 
885     private static DurationMapping make (Duration def) @safe pure nothrow @nogc
886     {
887         typeof(return) result;
888         auto fullSplit = def.split();
889         result.weeks = SetInfo!long(fullSplit.weeks, fullSplit.weeks != 0);
890         result.days = SetInfo!long(fullSplit.days, fullSplit.days != 0);
891         result.hours = SetInfo!long(fullSplit.hours, fullSplit.hours != 0);
892         result.minutes = SetInfo!long(fullSplit.minutes, fullSplit.minutes != 0);
893         result.seconds = SetInfo!long(fullSplit.seconds, fullSplit.seconds != 0);
894         result.msecs = SetInfo!long(fullSplit.msecs, fullSplit.msecs != 0);
895         result.usecs = SetInfo!long(fullSplit.usecs, fullSplit.usecs != 0);
896         result.hnsecs = SetInfo!long(fullSplit.hnsecs, fullSplit.hnsecs != 0);
897         // nsecs is ignored by split as it's not representable in `Duration`
898         return result;
899     }
900 
901     ///
902     public void validate () const @safe
903     {
904         // That check should never fail, as the YAML parser would error out,
905         // but better be safe than sorry.
906         foreach (field; this.tupleof)
907             if (field.set)
908                 return;
909 
910         throw new Exception(
911             "Expected at least one of the components (weeks, days, hours, " ~
912             "minutes, seconds, msecs, usecs, hnsecs, nsecs) to be set");
913     }
914 
915     ///  Allow conversion to a `Duration`
916     public Duration opCast (T : Duration) () const scope @safe pure nothrow @nogc
917     {
918         return core.time.weeks(this.weeks) + core.time.days(this.days) +
919             core.time.hours(this.hours) + core.time.minutes(this.minutes) +
920             core.time.seconds(this.seconds) + core.time.msecs(this.msecs) +
921             core.time.usecs(this.usecs) + core.time.hnsecs(this.hnsecs) +
922             core.time.nsecs(this.nsecs);
923     }
924 }
925 
926 /// Evaluates to `true` if we should recurse into the struct via `parseMapping`
927 private enum mightBeOptional (alias FR) = is(FR.Type == struct) &&
928     !is(immutable(FR.Type) == immutable(core.time.Duration)) &&
929     !hasConverter!(FR.Ref) && !hasFromString!(FR.Type) &&
930     !hasStringCtor!(FR.Type) && !hasFromYAML!(FR.Type);
931 
932 /// Convenience template to check for the presence of converter(s)
933 private enum hasConverter (alias Field) = hasUDA!(Field, Converter);
934 
935 /// Provided a field reference `FR` which is known to have at least one converter,
936 /// perform basic checks and return the value after applying the converter.
937 private auto viaConverter (alias FR) (Node node, string path, in Context context)
938 {
939     enum Converters = getUDAs!(FR.Ref, Converter);
940     static assert (Converters.length,
941                    "Internal error: `viaConverter` called on field `" ~
942                    FR.FieldName ~ "` with no converter");
943 
944     static assert(Converters.length == 1,
945                   "Field `" ~ FR.FieldName ~ "` cannot have more than one `Converter`");
946 
947     scope impl = new ConfigParserImpl!(FR.Type)(node, path, context);
948     return Converters[0].converter(impl);
949 }
950 
951 private final class ConfigParserImpl (T) : ConfigParser!T
952 {
953     private Node node_;
954     private string path_;
955     private const(Context) context_;
956 
957     /// Ctor
958     public this (Node n, string p, const Context c) scope @safe pure nothrow @nogc
959     {
960         this.node_ = n;
961         this.path_ = p;
962         this.context_ = c;
963     }
964 
965     public final override inout(Node) node () inout @safe pure nothrow @nogc
966     {
967         return this.node_;
968     }
969 
970     public final override string path () const @safe pure nothrow @nogc
971     {
972         return this.path_;
973     }
974 
975     protected final override const(Context) context () const @safe pure nothrow @nogc
976     {
977         return this.context_;
978     }
979 }
980 
981 /// Helper predicate
982 private template NameIs (string searching)
983 {
984     enum bool Pred (alias FR) = (searching == FR.Name);
985 }
986 
/// Returns whether or not the mapping has an `enabled` / `disabled` field,
/// and its value. If it has neither, the mapping is considered enabled.
989 private EnabledState isMappingEnabled (M) (Node node, auto ref M default_)
990 {
991     import std.meta : Filter;
992 
993     alias EMT = Filter!(NameIs!("enabled").Pred, FieldRefTuple!M);
994     alias DMT = Filter!(NameIs!("disabled").Pred, FieldRefTuple!M);
995 
996     static if (EMT.length)
997     {
998         static assert (DMT.length == 0,
999                        "`enabled` field `" ~ EMT[0].FieldName ~
1000                        "` conflicts with `disabled` field `" ~ DMT[0].FieldName ~ "`");
1001 
1002         if (auto ptr = "enabled" in node)
1003             return EnabledState(EnabledState.Field.Enabled, (*ptr).as!bool);
1004         return EnabledState(EnabledState.Field.Enabled, __traits(getMember, default_, EMT[0].FieldName));
1005     }
1006     else static if (DMT.length)
1007     {
1008         if (auto ptr = "disabled" in node)
1009             return EnabledState(EnabledState.Field.Disabled, (*ptr).as!bool);
1010         return EnabledState(EnabledState.Field.Disabled, __traits(getMember, default_, DMT[0].FieldName));
1011     }
1012     else
1013     {
1014         return EnabledState(EnabledState.Field.None);
1015     }
1016 }
1017 
1018 /// Return value of `isMappingEnabled`
1019 private struct EnabledState
1020 {
    /// Used to determine which field controls a mapping's enabled state
1022     private enum Field
1023     {
1024         /// No such field, the mapping is considered enabled
1025         None,
1026         /// The field is named 'enabled'
1027         Enabled,
1028         /// The field is named 'disabled'
1029         Disabled,
1030     }
1031 
1032     /// Check if the mapping is considered enabled
1033     public bool opCast () const scope @safe pure @nogc nothrow
1034     {
1035         return this.field == Field.None ||
1036             (this.field == Field.Enabled && this.fieldValue) ||
1037             (this.field == Field.Disabled && !this.fieldValue);
1038     }
1039 
1040     /// Type of field found
1041     private Field field;
1042 
1043     /// Value of the field, interpretation depends on `field`
1044     private bool fieldValue;
1045 }
1046 
/// Evaluates to `true` if `T` is a `struct` that can be constructed field-wise
1048 private enum hasFieldwiseCtor (T) = (is(T == struct) && is(typeof(() => T(T.init.tupleof))));
1049 
1050 /// Evaluates to `true` if `T` has a static method that is designed to work with this library
1051 private enum hasFromYAML (T) = is(typeof(T.fromYAML(ConfigParser!(T).init)) : T);
1052 
1053 /// Evaluates to `true` if `T` has a static method that accepts a `string` and returns a `T`
1054 private enum hasFromString (T) = is(typeof(T.fromString(string.init)) : T);
1055 
/// Evaluates to `true` if `T` is a `struct` whose constructor accepts a single `string` argument
1057 private enum hasStringCtor (T) = (is(T == struct) && is(typeof(T.__ctor)) &&
1058                                   Parameters!(T.__ctor).length == 1 &&
1059                                   is(typeof(() => T(string.init))));
1060 
1061 unittest
1062 {
1063     static struct Simple
1064     {
1065         int value;
1066         string otherValue;
1067     }
1068 
1069     static assert( hasFieldwiseCtor!Simple);
1070     static assert(!hasStringCtor!Simple);
1071 
1072     static struct PubKey
1073     {
1074         ubyte[] data;
1075 
        this (string hex) @safe pure nothrow @nogc {}
1077     }
1078 
1079     static assert(!hasFieldwiseCtor!PubKey);
1080     static assert( hasStringCtor!PubKey);
1081 
1082     static assert(!hasFieldwiseCtor!string);
1083     static assert(!hasFieldwiseCtor!int);
1084     static assert(!hasStringCtor!string);
1085     static assert(!hasStringCtor!int);
1086 }
1087 
1088 /// Convenience function to extend a YAML path
1089 private string addPath (string opath, string newPart)
1090 in(newPart.length)
1091 do {
1092     return opath.length ? format("%s.%s", opath, newPart) : newPart;
1093 }