1 /*******************************************************************************
2 
3     Utilities to fill a struct representing the configuration with the content
4     of a YAML document.
5 
6     The main function of this module is `parseConfig`. Convenience functions
7     `parseConfigString` and `parseConfigFile` are also available.
8 
9     The type parameter to those three functions must be a struct and is used
10     to drive the processing of the YAML node. When an error is encountered,
11     an `Exception` will be thrown, with a descriptive message.
12     The rules by which the struct is filled are designed to be
13     as intuitive as possible, and are described below.
14 
15     Optional_Fields:
16       One of the major convenience offered by this utility is its handling
17       of optional fields. A field is detected as optional if it has
18       an initializer that is different from its type `init` value,
19       for example `string field = "Something";` is an optional field,
20       but `int count = 0;` is not.
21       To mark a field as optional even with its default value,
22       use the `Optional` UDA: `@Optional int count = 0;`.
23 
24     fromYAML:
25       Because config structs may contain complex types outside of the project's
26       control (e.g. a Phobos type, Vibe.d's `URL`, etc...) or one may want
27       the config format to be more dynamic (e.g. by exposing union-like behavior),
28       one may need to apply more custom logic than what Configy does.
29       For this use case, one can define a `fromYAML` static method in the type:
30       `static S fromYAML(scope ConfigParser!S parser)`, where `S` is the type of
31       the enclosing structure. Structs with `fromYAML` will have this method
32       called instead of going through the normal parsing rules.
33       The `ConfigParser` exposes the current path of the field, as well as the
34       raw YAML `Node` itself, allowing for maximum flexibility.
35 
36     Composite_Types:
37       Processing starts from a `struct` at the top level, and recurse into
38       every fields individually. If a field is itself a struct,
39       the filler will attempt the following, in order:
40       - If the field has no value and is not optional, an Exception will
41         be thrown with an error message detailing where the issue happened.
42       - If the field has no value and is optional, the default value will
43         be used.
44       - If the field has a value, the filler will first check for a converter
45         and use it if present.
46       - If the type has a `static` method named `fromString` whose sole argument
47         is a `string`, it will be used.
48       - If the type has a constructor whose sole argument is a `string`,
49         it will be used;
50       - Finally, the filler will attempt to deserialize all struct members
51         one by one and pass them to the default constructor, if there is any.
52       - If none of the above succeeded, a `static assert` will trigger.
53 
54     Alias_this:
55       If a `struct` contains an `alias this`, the field that is aliased will be
56       ignored, instead the config parser will parse nested fields as if they
57       were part of the enclosing structure. This allow to re-use a single `struct`
58       in multiple place without having to resort to a `mixin template`.
59       Having an initializer will make all fields in the aliased struct optional.
60       The aliased field cannot have attributes other than `@Optional`,
61       which will then apply to all fields it exposes.
62 
63     Duration_parsing:
64       If the config field is of type `core.time.Duration`, special parsing rules
65       will apply. There are two possible forms in which a Duration field may
66       be expressed. In the first form, the YAML node should be a mapping,
67       and it will be checked for fields matching the supported units
68       in `core.time`: `weeks`, `days`, `hours`, `minutes`, `seconds`, `msecs`,
69       `usecs`, `hnsecs`, `nsecs`. Strict parsing option will be respected.
70       The values of the fields will then be added together, so the following
71       YAML usages are equivalent:
72       ---
73       // sleepFor:
74       //   hours: 8
75       //   minutes: 30
76       ---
77       and:
78       ---
79       // sleepFor:
80       //   minutes: 510
81       ---
82       Provided that the definition of the field is:
83       ---
84       public Duration sleepFor;
85       ---
86 
87       In the second form, the field should have a suffix composed of an
88       underscore ('_'), followed by a unit name as defined in `core.time`.
89       This can be either the field name directly, or a name override.
90       The latter is recommended to avoid confusion when using the field in code.
91       In this form, the YAML node is expected to be a scalar.
92       So the previous example, using this form, would be expressed as:
93       ---
94       sleepFor_minutes: 510
95       ---
96       and the field definition should be one of those two:
97       ---
98       public @Name("sleepFor_minutes") Duration sleepFor; /// Prefer this
99       public Duration sleepFor_minutes; /// This works too
100       ---
101 
102       Those forms are mutually exclusive, so a field with a unit suffix
103       will error out if a mapping is used. This prevents surprises and ensures
104       that the error message, if any, is consistent across user input.
105 
106       To disable or change this behavior, one may use a `Converter` instead.
107 
108     Strict_Parsing:
109       When strict parsing is enabled, the config filler will also validate
110       that the YAML nodes do not contains entry which are not present in the
111       mapping (struct) being processed.
112       This can be useful to catch typos or outdated configuration options.
113 
114     Post_Validation:
115       Some configuration will require validation across multiple sections.
116       For example, two sections may be mutually exclusive as a whole,
117       or may have fields which are mutually exclusive with another section's
118       field(s). This kind of dependence is hard to account for declaratively,
119       and does not affect parsing. For this reason, the preferred way to
120       handle those cases is to define a `validate` member method on the
121       affected config struct(s), which will be called once
122       parsing for that mapping is completed.
123       If an error is detected, this method should throw an Exception.
124 
125     Enabled_or_disabled_field:
126       While most complex logic validation should be handled post-parsing,
127       some section may be optional by default, but if provided, will have
128       required fields. To support this use case, if a field with the name
129       `enabled` is present in a struct, the parser will first process it.
130       If it is `false`, the parser will not attempt to process the struct
131       further, and the other fields will have their default value.
132       Likewise, if a field named `disabled` exists, the struct will not
133       be processed if it is set to `true`.
134 
135     Copyright:
136         Copyright (c) 2019-2022 BOSAGORA Foundation
137         All rights reserved.
138 
139     License:
140         MIT License. See LICENSE for details.
141 
142 *******************************************************************************/
143 
144 module dub.internal.configy.Read;
145 
146 public import dub.internal.configy.Attributes;
147 public import dub.internal.configy.Exceptions : ConfigException;
148 import dub.internal.configy.Exceptions;
149 import dub.internal.configy.FieldRef;
150 import dub.internal.configy.Utils;
151 
152 import dub.internal.dyaml.exception;
153 import dub.internal.dyaml.node;
154 import dub.internal.dyaml.loader;
155 
156 import std.algorithm;
157 import std.conv;
158 import std.datetime;
159 import std.format;
160 import std.getopt;
161 import std.meta;
162 import std.range;
163 import std.traits;
164 import std.typecons : Nullable, nullable, tuple;
165 
166 static import core.time;
167 
168 // Dub-specific adjustments for output
169 import dub.internal.logging;
170 
171 /// Command-line arguments
172 public struct CLIArgs
173 {
174     /// Path to the config file
175     public string config_path = "config.yaml";
176 
177     /// Overrides for config options
178     public string[][string] overrides;
179 
180     /// Helper to add items to `overrides`
181     public void overridesHandler (string, string value)
182     {
183         import std.string;
184         const idx = value.indexOf('=');
185         if (idx < 0) return;
186         string k = value[0 .. idx], v = value[idx + 1 .. $];
187         if (auto val = k in this.overrides)
188             (*val) ~= v;
189         else
190             this.overrides[k] = [ v ];
191     }
192 
193     /***************************************************************************
194 
195         Parses the base command line arguments
196 
197         This can be composed with the program argument.
198         For example, consider a program which wants to expose a `--version`
199         switch, the definition could look like this:
200         ---
201         public struct ProgramCLIArgs
202         {
203             public CLIArgs base; // This struct
204 
205             public alias base this; // For convenience
206 
207             public bool version_; // Program-specific part
208         }
209         ---
210         Then, an application-specific configuration routine would be:
211         ---
212         public GetoptResult parse (ref ProgramCLIArgs clargs, ref string[] args)
213         {
214             auto r = clargs.base.parse(args);
215             if (r.helpWanted) return r;
216             return getopt(
217                 args,
218                 "version", "Print the application version, &clargs.version_");
219         }
220         ---
221 
222         Params:
223           args = The command line args to parse (parsed options will be removed)
224           passThrough = Whether to enable `config.passThrough` and
225                         `config.keepEndOfOptions`. `true` by default, to allow
226                         composability. If your program doesn't have other
227                         arguments, pass `false`.
228 
229         Returns:
230           The result of calling `getopt`
231 
232     ***************************************************************************/
233 
234     public GetoptResult parse (ref string[] args, bool passThrough = true)
235     {
236         return getopt(
237             args,
238             // `caseInsensitive` is the default, but we need something
239             // with the same type for the ternary
240             passThrough ? config.keepEndOfOptions : config.caseInsensitive,
241             // Also the default, same reasoning
242             passThrough ? config.passThrough : config.noPassThrough,
243             "config|c",
244                 "Path to the config file. Defaults to: " ~ this.config_path,
245                 &this.config_path,
246 
247             "override|O",
248                 "Override a config file value\n" ~
249                 "Example: -O foo.bar=true -o dns=1.1.1.1 -o dns=2.2.2.2\n" ~
250                 "Array values are additive, other items are set to the last override",
251                 &this.overridesHandler,
252         );
253     }
254 }
255 
256 /*******************************************************************************
257 
258     Attempt to read and process the config file at `path`, print any error
259 
260     This 'simple' overload of the more detailed `parseConfigFile` will attempt
261     to read the file at `path`, and return a `Nullable` instance of it.
262     If an error happens, either because the file isn't readable or
263     the configuration has an issue, a message will be printed to `stderr`,
264     with colors if the output is a TTY, and a `null` instance will be returned.
265 
266     The calling code can hence just read a config file via:
267     ```
268     int main ()
269     {
270         auto configN = parseConfigFileSimple!Config("config.yaml");
271         if (configN.isNull()) return 1; // Error path
272         auto config = configN.get();
273         // Rest of the program ...
274     }
275     ```
276     An overload accepting `CLIArgs args` also exists.
277 
278     Params:
279         path = Path of the file to read from
280         args = Command line arguments on which `parse` has been called
281         strict = Whether the parsing should reject unknown keys in the
282                  document, warn, or ignore them (default: `StrictMode.Error`)
283 
284     Returns:
285         An initialized `Config` instance if reading/parsing was successful;
286         a `null` instance otherwise.
287 
288 *******************************************************************************/
289 
290 public Nullable!T parseConfigFileSimple (T) (string path, StrictMode strict = StrictMode.Error)
291 {
292     return parseConfigFileSimple!(T)(CLIArgs(path), strict);
293 }
294 
295 
296 /// Ditto
297 public Nullable!T parseConfigFileSimple (T) (in CLIArgs args, StrictMode strict = StrictMode.Error)
298 {
299     return wrapException(parseConfigFile!T(args, strict));
300 }
301 
302 /// Ditto
303 public Nullable!T wrapException (T) (lazy T parseCall)
304 {
305     try
306         return nullable(parseCall);
307     catch (ConfigException exc)
308     {
309         exc.printException();
310         return typeof(return).init;
311     }
312     catch (Exception exc)
313     {
314         // Other Exception type may be thrown by D-YAML,
315         // they won't include rich information.
316         logWarn("%s", exc.message());
317         return typeof(return).init;
318     }
319 }
320 
321 /*******************************************************************************
322 
323     Print an Exception, potentially with colors on
324 
325     Trusted because of `stderr` usage.
326 
327 *******************************************************************************/
328 
329 private void printException (scope ConfigException exc) @trusted
330 {
331     import dub.internal.logging;
332 
333     if (hasColors)
334         logWarn("%S", exc);
335     else
336         logWarn("%s", exc.message());
337 }
338 
339 /*******************************************************************************
340 
341     Parses the config file or string and returns a `Config` instance.
342 
343     Params:
344         cmdln = command-line arguments (containing the path to the config)
345         path = When parsing a string, the path corresponding to it
346         strict = Whether the parsing should reject unknown keys in the
347                  document, warn, or ignore them (default: `StrictMode.Error`)
348 
349     Throws:
350         `Exception` if parsing the config file failed.
351 
352     Returns:
353         `Config` instance
354 
355 *******************************************************************************/
356 
357 public T parseConfigFile (T) (in CLIArgs cmdln, StrictMode strict = StrictMode.Error)
358 {
359     Node root = Loader.fromFile(cmdln.config_path).load();
360     return parseConfig!T(cmdln, root, strict);
361 }
362 
363 /// ditto
364 public T parseConfigString (T) (string data, string path, StrictMode strict = StrictMode.Error)
365 {
366     CLIArgs cmdln = { config_path: path };
367     auto loader = Loader.fromString(data);
368     loader.name = path;
369     Node root = loader.load();
370     return parseConfig!T(cmdln, root, strict);
371 }
372 
373 /*******************************************************************************
374 
375     Process the content of the YAML document described by `node` into an
376     instance of the struct `T`.
377 
378     See the module description for a complete overview of this function.
379 
380     Params:
381       T = Type of the config struct to fill
382       cmdln = Command line arguments
383       node = The root node matching `T`
384       strict = Action to take when encountering unknown keys in the document
385 
386     Returns:
387       An instance of `T` filled with the content of `node`
388 
389     Throws:
390       If the content of `node` cannot satisfy the requirements set by `T`,
391       or if `node` contain extra fields and `strict` is `true`.
392 
393 *******************************************************************************/
394 
395 public T parseConfig (T) (
396     in CLIArgs cmdln, Node node, StrictMode strict = StrictMode.Error)
397 {
398     static assert(is(T == struct), "`" ~ __FUNCTION__ ~
399                   "` should only be called with a `struct` type as argument, not: `" ~
400                   fullyQualifiedName!T ~ "`");
401 
402     final switch (node.nodeID)
403     {
404     case NodeID.mapping:
405             dbgWrite("Parsing config '%s', strict: %s",
406                      fullyQualifiedName!T,
407                      strict == StrictMode.Warn ?
408                        strict.paint(Yellow) : strict.paintIf(!!strict, Green, Red));
409             return node.parseField!(StructFieldRef!T)(
410                 null, T.init, const(Context)(cmdln, strict));
411     case NodeID.sequence:
412     case NodeID.scalar:
413     case NodeID.invalid:
414         throw new TypeConfigException(node, "mapping (object)", "document root");
415     }
416 }
417 
418 /*******************************************************************************
419 
420     The behavior to have when encountering a field in YAML not present
421     in the config definition.
422 
423 *******************************************************************************/
424 
425 public enum StrictMode
426 {
427     /// Issue an error by throwing an `UnknownKeyConfigException`
428     Error  = 0,
429     /// Write a message to `stderr`, but continue processing the file
430     Warn   = 1,
431     /// Be silent and do nothing
432     Ignore = 2,
433 }
434 
435 /// Used to pass around configuration
436 package struct Context
437 {
438     ///
439     private CLIArgs cmdln;
440 
441     ///
442     private StrictMode strict;
443 }
444 
445 /*******************************************************************************
446 
447     Parse a mapping from `node` into an instance of `T`
448 
449     Params:
450       TLFR = Top level field reference for this mapping
451       node = The YAML node object matching the struct being read
452       path = The runtime path to this mapping, used for nested types
453       defaultValue = The default value to use for `T`, which can be different
454                      from `T.init` when recursing into fields with initializers.
455       ctx = A context where properties that need to be conserved during
456             recursion are stored
457       fieldDefaults = Default value for some fields, used for `Key` recursion
458 
459 *******************************************************************************/
460 private TLFR.Type parseMapping (alias TLFR)
461     (Node node, string path, auto ref TLFR.Type defaultValue,
462      in Context ctx, in Node[string] fieldDefaults)
463 {
464     static assert(is(TLFR.Type == struct), "`parseMapping` called with wrong type (should be a `struct`)");
465     assert(node.nodeID == NodeID.mapping, "Internal error: parseMapping shouldn't have been called");
466 
467     dbgWrite("%s: `parseMapping` called for '%s' (node entries: %s)",
468              TLFR.Type.stringof.paint(Cyan), path.paint(Cyan),
469              node.length.paintIf(!!node.length, Green, Red));
470 
471     static foreach (FR; FieldRefTuple!(TLFR.Type))
472     {
473         static if (FR.Name != FR.FieldName && hasMember!(TLFR.Type, FR.Name) &&
474                    !is(typeof(mixin("TLFR.Type.", FR.Name)) == function))
475             static assert (FieldRef!(TLFR.Type, FR.Name).Name != FR.Name,
476                            "Field `" ~ FR.FieldName ~ "` `@Name` attribute shadows field `" ~
477                            FR.Name ~ "` in `" ~ TLFR.Type.stringof ~ "`: Add a `@Name` attribute to `" ~
478                            FR.Name ~ "` or change that of `" ~ FR.FieldName ~ "`");
479     }
480 
481     if (ctx.strict != StrictMode.Ignore)
482     {
483         /// First, check that all the sections found in the mapping are present in the type
484         /// If not, the user might have made a typo.
485         immutable string[] fieldNames = [ FieldsName!(TLFR.Type) ];
486         immutable string[] patterns = [ Patterns!(TLFR.Type) ];
487     FIELD: foreach (const ref Node key, const ref Node value; node)
488         {
489             const k = key.as!string;
490             if (!fieldNames.canFind(k))
491             {
492                 foreach (p; patterns)
493                     if (k.startsWith(p))
494                         // Require length because `0` would match `canFind`
495                         // and we don't want to allow `$PATTERN-`
496                         if (k[p.length .. $].length > 1 && k[p.length] == '-')
497                             continue FIELD;
498 
499                 if (ctx.strict == StrictMode.Warn)
500                 {
501                     scope exc = new UnknownKeyConfigException(
502                         path, key.as!string, fieldNames, key.startMark());
503                     exc.printException();
504                 }
505                 else
506                     throw new UnknownKeyConfigException(
507                         path, key.as!string, fieldNames, key.startMark());
508             }
509         }
510     }
511 
512     const enabledState = node.isMappingEnabled!(TLFR.Type)(defaultValue);
513 
514     if (enabledState.field != EnabledState.Field.None)
515         dbgWrite("%s: Mapping is enabled: %s", TLFR.Type.stringof.paint(Cyan), (!!enabledState).paintBool());
516 
517     auto convertField (alias FR) ()
518     {
519         static if (FR.Name != FR.FieldName)
520             dbgWrite("Field name `%s` will use YAML field `%s`",
521                      FR.FieldName.paint(Yellow), FR.Name.paint(Green));
522         // Using exact type here matters: we could get a qualified type
523         // (e.g. `immutable(string)`) if the field is qualified,
524         // which causes problems.
525         FR.Type default_ = __traits(getMember, defaultValue, FR.FieldName);
526 
527         // If this struct is disabled, do not attempt to parse anything besides
528         // the `enabled` / `disabled` field.
529         if (!enabledState)
530         {
531             // Even this is too noisy
532             version (none)
533                 dbgWrite("%s: %s field of disabled struct, default: %s",
534                          path.paint(Cyan), "Ignoring".paint(Yellow), default_);
535 
536             static if (FR.Name == "enabled")
537                 return false;
538             else static if (FR.Name == "disabled")
539                 return true;
540             else
541                 return default_;
542         }
543 
544         if (auto ptr = FR.FieldName in fieldDefaults)
545         {
546             dbgWrite("Found %s (%s.%s) in `fieldDefaults`",
547                      FR.Name.paint(Cyan), path.paint(Cyan), FR.FieldName.paint(Cyan));
548 
549             if (ctx.strict && FR.FieldName in node)
550                 throw new ConfigExceptionImpl("'Key' field is specified twice", path, FR.FieldName, node.startMark());
551             return (*ptr).parseField!(FR)(path.addPath(FR.FieldName), default_, ctx)
552                 .dbgWriteRet("Using value '%s' from fieldDefaults for field '%s'",
553                              FR.FieldName.paint(Cyan));
554         }
555 
556         // This, `FR.Pattern`, and the field in `@Name` are special support for `dub`
557         static if (FR.Pattern)
558         {
559             static if (is(FR.Type : V[K], K, V))
560             {
561                 alias AAFieldRef = NestedFieldRef!(V, FR);
562                 static assert(is(K : string), "Key type should be string-like");
563             }
564             else
565                 static assert(0, "Cannot have pattern on non-AA field");
566 
567             AAFieldRef.Type[string] result;
568             foreach (pair; node.mapping)
569             {
570                 const key = pair.key.as!string;
571                 if (!key.startsWith(FR.Name))
572                     continue;
573                 string suffix = key[FR.Name.length .. $];
574                 if (suffix.length)
575                 {
576                     if (suffix[0] == '-') suffix = suffix[1 .. $];
577                     else continue;
578                 }
579 
580                 result[suffix] = pair.value.parseField!(AAFieldRef)(
581                     path.addPath(key), default_.get(key, AAFieldRef.Type.init), ctx);
582             }
583             bool hack = true;
584             if (hack) return result;
585         }
586 
587         if (auto ptr = FR.Name in node)
588         {
589             dbgWrite("%s: YAML field is %s in node%s",
590                      FR.Name.paint(Cyan), "present".paint(Green),
591                      (FR.Name == FR.FieldName ? "" : " (note that field name is overriden)").paint(Yellow));
592             return (*ptr).parseField!(FR)(path.addPath(FR.Name), default_, ctx)
593                 .dbgWriteRet("Using value '%s' from YAML document for field '%s'",
594                              FR.FieldName.paint(Cyan));
595         }
596 
597         dbgWrite("%s: Field is %s from node%s",
598                  FR.Name.paint(Cyan), "missing".paint(Red),
599                  (FR.Name == FR.FieldName ? "" : " (note that field name is overriden)").paint(Yellow));
600 
601         // A field is considered optional if it has an initializer that is different
602         // from its default value, or if it has the `Optional` UDA.
603         // In that case, just return this value.
604         static if (FR.Optional)
605             return default_
606                 .dbgWriteRet("Using default value '%s' for optional field '%s'", FR.FieldName.paint(Cyan));
607 
608         // The field is not present, but it could be because it is an optional section.
609         // For example, the section could be defined as:
610         // ---
611         // struct RequestLimit { size_t reqs = 100; }
612         // struct Config { RequestLimit limits; }
613         // ---
614         // In this case we need to recurse into `RequestLimit` to check if any
615         // of its field is required.
616         else static if (mightBeOptional!FR)
617         {
618             const npath = path.addPath(FR.Name);
619             string[string] aa;
620             return Node(aa).parseMapping!(FR)(npath, default_, ctx, null);
621         }
622         else
623             throw new MissingKeyException(path, FR.Name, node.startMark());
624     }
625 
626     FR.Type convert (alias FR) ()
627     {
628         static if (__traits(getAliasThis, TLFR.Type).length == 1 &&
629                    __traits(getAliasThis, TLFR.Type)[0] == FR.FieldName)
630         {
631             static assert(FR.Name == FR.FieldName,
632                           "Field `" ~ fullyQualifiedName!(FR.Ref) ~
633                           "` is the target of an `alias this` and cannot have a `@Name` attribute");
634             static assert(!hasConverter!(FR.Ref),
635                           "Field `" ~ fullyQualifiedName!(FR.Ref) ~
636                           "` is the target of an `alias this` and cannot have a `@Converter` attribute");
637 
638             alias convertW(string FieldName) = convert!(FieldRef!(FR.Type, FieldName, FR.Optional));
639             return FR.Type(staticMap!(convertW, FieldNameTuple!(FR.Type)));
640         }
641         else
642             return convertField!(FR)();
643     }
644 
645     debug (ConfigFillerDebug)
646     {
647         indent++;
648         scope (exit) indent--;
649     }
650 
651     TLFR.Type doValidation (TLFR.Type result)
652     {
653         static if (is(typeof(result.validate())))
654         {
655             if (enabledState)
656             {
657                 dbgWrite("%s: Calling `%s` method",
658                      TLFR.Type.stringof.paint(Cyan), "validate()".paint(Green));
659                 result.validate();
660             }
661             else
662             {
663                 dbgWrite("%s: Ignoring `%s` method on disabled mapping",
664                          TLFR.Type.stringof.paint(Cyan), "validate()".paint(Green));
665             }
666         }
667         else if (enabledState)
668             dbgWrite("%s: No `%s` method found",
669                      TLFR.Type.stringof.paint(Cyan), "validate()".paint(Yellow));
670 
671         return result;
672     }
673 
674     // This might trigger things like "`this` is not accessible".
675     // In this case, the user most likely needs to provide a converter.
676     alias convertWrapper(string FieldName) = convert!(FieldRef!(TLFR.Type, FieldName));
677     return doValidation(TLFR.Type(staticMap!(convertWrapper, FieldNameTuple!(TLFR.Type))));
678 }
679 
680 /*******************************************************************************
681 
682     Parse a field, trying to match up the compile-time expectation with
683     the run time value of the Node (`nodeID`).
684 
685     This is the central point which does "type conversion", from the YAML node
686     to the field type. Whenever adding support for a new type, things should
687     happen here.
688 
689     Because a `struct` can be filled from either a mapping or a scalar,
690     this function will first try the converter / fromString / string ctor
691     methods before defaulting to field-wise construction.
692 
693     Note that optional fields are checked before recursion happens,
694     so this method does not do this check.
695 
696 *******************************************************************************/
697 
698 package FR.Type parseField (alias FR)
699     (Node node, string path, auto ref FR.Type defaultValue, in Context ctx)
700 {
701     if (node.nodeID == NodeID.invalid)
702         throw new TypeConfigException(node, "valid", path);
703 
704     // If we reached this, it means the field is set, so just recurse
705     // to peel the type
706     static if (is(FR.Type : SetInfo!FT, FT))
707         return FR.Type(
708             parseField!(FieldRef!(FR.Type, "value"))(node, path, defaultValue, ctx),
709             true);
710 
711     else static if (hasConverter!(FR.Ref))
712         return wrapException(node.viaConverter!(FR)(path, ctx), path, node.startMark());
713 
714     else static if (hasFromYAML!(FR.Type))
715     {
716         scope impl = new ConfigParserImpl!(FR.Type)(node, path, ctx);
717         return wrapException(FR.Type.fromYAML(impl), path, node.startMark());
718     }
719 
720     else static if (hasFromString!(FR.Type))
721         return wrapException(FR.Type.fromString(node.as!string), path, node.startMark());
722 
723     else static if (hasStringCtor!(FR.Type))
724         return wrapException(FR.Type(node.as!string), path, node.startMark());
725 
726     else static if (is(immutable(FR.Type) == immutable(core.time.Duration)))
727     {
728         if (node.nodeID != NodeID.mapping)
729             throw new DurationTypeConfigException(node, path);
730         return node.parseMapping!(StructFieldRef!DurationMapping)(
731             path, DurationMapping.make(defaultValue), ctx, null).opCast!Duration;
732     }
733 
734     else static if (is(FR.Type == struct))
735     {
736         if (node.nodeID != NodeID.mapping)
737             throw new TypeConfigException(node, "mapping (object)", path);
738         return node.parseMapping!(FR)(path, defaultValue, ctx, null);
739     }
740 
741     // Handle string early as they match the sequence rule too
742     else static if (isSomeString!(FR.Type))
743         // Use `string` type explicitly because `Variant` thinks
744         // `immutable(char)[]` (aka `string`) and `immutable(char[])`
745         // (aka `immutable(string)`) are not compatible.
746         return node.parseScalar!(string)(path);
747     // Enum too, as their base type might be an array (including strings)
748     else static if (is(FR.Type == enum))
749         return node.parseScalar!(FR.Type)(path);
750 
751     else static if (is(FR.Type : E[K], E, K))
752     {
753         if (node.nodeID != NodeID.mapping)
754             throw new TypeConfigException(node, "mapping (associative array)", path);
755 
756         // Note: As of June 2022 (DMD v2.100.0), associative arrays cannot
757         // have initializers, hence their UX for config is less optimal.
758         return node.mapping().map!(
759                 (Node.Pair pair) {
760                     return tuple(
761                         pair.key.get!K,
762                         pair.value.parseField!(NestedFieldRef!(E, FR))(
763                             format("%s[%s]", path, pair.key.as!string), E.init, ctx));
764                 }).assocArray();
765 
766     }
767     else static if (is(FR.Type : E[], E))
768     {
769         static if (hasUDA!(FR.Ref, Key))
770         {
771             static assert(getUDAs!(FR.Ref, Key).length == 1,
772                           "`" ~ fullyQualifiedName!(FR.Ref) ~
773                           "` field shouldn't have more than one `Key` attribute");
774             static assert(is(E == struct),
775                           "Field `" ~ fullyQualifiedName!(FR.Ref) ~
776                           "` has a `Key` attribute, but is a sequence of `" ~
777                           fullyQualifiedName!E ~ "`, not a sequence of `struct`");
778 
779             string key = getUDAs!(FR.Ref, Key)[0].name;
780 
781             if (node.nodeID != NodeID.mapping && node.nodeID != NodeID.sequence)
782                 throw new TypeConfigException(node, "mapping (object) or sequence", path);
783 
784             if (node.nodeID == NodeID.mapping) return node.mapping().map!(
785                 (Node.Pair pair) {
786                     if (pair.value.nodeID != NodeID.mapping)
787                         throw new TypeConfigException(
788                             "sequence of " ~ pair.value.nodeTypeString(),
789                             "sequence of mapping (array of objects)",
790                             path, null, node.startMark());
791 
792                     return pair.value.parseMapping!(StructFieldRef!E)(
793                         path.addPath(pair.key.as!string),
794                         E.init, ctx, key.length ? [ key: pair.key ] : null);
795                 }).array();
796         }
797         if (node.nodeID != NodeID.sequence)
798             throw new TypeConfigException(node, "sequence (array)", path);
799 
800         typeof(return) validateLength (E[] res)
801         {
802             static if (is(FR.Type : E_[k], E_, size_t k))
803             {
804                 if (res.length != k)
805                     throw new ArrayLengthException(
806                         res.length, k, path, null, node.startMark());
807                 return res[0 .. k];
808             }
809             else
810                 return res;
811         }
812 
813         // We pass `E.init` as default value as it is not going to be used:
814         // Either there is something in the YAML document, and that will be
815         // converted, or `sequence` will not iterate.
816         return validateLength(
817             node.sequence.enumerate.map!(
818             kv => kv.value.parseField!(NestedFieldRef!(E, FR))(
819                 format("%s[%s]", path, kv.index), E.init, ctx))
820             .array()
821         );
822     }
823     else
824     {
825         static assert (!is(FR.Type == union),
826                        "`union` are not supported. Use a converter instead");
827         return node.parseScalar!(FR.Type)(path);
828     }
829 }
830 
831 /// Parse a node as a scalar
832 private T parseScalar (T) (Node node, string path)
833 {
834     if (node.nodeID != NodeID.scalar)
835         throw new TypeConfigException(node, "scalar (value)", path);
836 
837     static if (is(T == enum))
838         return node.as!string.to!(T);
839     else
840         return node.as!(T);
841 }
842 
843 /*******************************************************************************
844 
845     Write a potentially throwing user-provided expression in ConfigException
846 
847     The user-provided hooks may throw (e.g. `fromString / the constructor),
848     and the error may or may not be clear. We can't do anything about a bad
849     message but we can wrap the thrown exception in a `ConfigException`
850     to provide the location in the yaml file where the error happened.
851 
852     Params:
853       exp = The expression that may throw
854       path = Path within the config file of the field
855       position = Position of the node in the YAML file
856       file = Call site file (otherwise the message would point to this function)
857       line = Call site line (see `file` reasoning)
858 
859     Returns:
860       The result of `exp` evaluation.
861 
862 *******************************************************************************/
863 
864 private T wrapException (T) (lazy T exp, string path, Mark position,
865     string file = __FILE__, size_t line = __LINE__)
866 {
867     try
868         return exp;
869     catch (ConfigException exc)
870         throw exc;
871     catch (Exception exc)
872         throw new ConstructionException(exc, path, position, file, line);
873 }
874 
875 /// Allows us to reuse parseMapping and strict parsing
876 private struct DurationMapping
877 {
878     public SetInfo!long weeks;
879     public SetInfo!long days;
880     public SetInfo!long hours;
881     public SetInfo!long minutes;
882     public SetInfo!long seconds;
883     public SetInfo!long msecs;
884     public SetInfo!long usecs;
885     public SetInfo!long hnsecs;
886     public SetInfo!long nsecs;
887 
888     private static DurationMapping make (Duration def) @safe pure nothrow @nogc
889     {
890         typeof(return) result;
891         auto fullSplit = def.split();
892         result.weeks = SetInfo!long(fullSplit.weeks, fullSplit.weeks != 0);
893         result.days = SetInfo!long(fullSplit.days, fullSplit.days != 0);
894         result.hours = SetInfo!long(fullSplit.hours, fullSplit.hours != 0);
895         result.minutes = SetInfo!long(fullSplit.minutes, fullSplit.minutes != 0);
896         result.seconds = SetInfo!long(fullSplit.seconds, fullSplit.seconds != 0);
897         result.msecs = SetInfo!long(fullSplit.msecs, fullSplit.msecs != 0);
898         result.usecs = SetInfo!long(fullSplit.usecs, fullSplit.usecs != 0);
899         result.hnsecs = SetInfo!long(fullSplit.hnsecs, fullSplit.hnsecs != 0);
900         // nsecs is ignored by split as it's not representable in `Duration`
901         return result;
902     }
903 
904     ///
905     public void validate () const @safe
906     {
907         // That check should never fail, as the YAML parser would error out,
908         // but better be safe than sorry.
909         foreach (field; this.tupleof)
910             if (field.set)
911                 return;
912 
913         throw new Exception(
914             "Expected at least one of the components (weeks, days, hours, " ~
915             "minutes, seconds, msecs, usecs, hnsecs, nsecs) to be set");
916     }
917 
918     ///  Allow conversion to a `Duration`
919     public Duration opCast (T : Duration) () const scope @safe pure nothrow @nogc
920     {
921         return core.time.weeks(this.weeks) + core.time.days(this.days) +
922             core.time.hours(this.hours) + core.time.minutes(this.minutes) +
923             core.time.seconds(this.seconds) + core.time.msecs(this.msecs) +
924             core.time.usecs(this.usecs) + core.time.hnsecs(this.hnsecs) +
925             core.time.nsecs(this.nsecs);
926     }
927 }
928 
929 /// Evaluates to `true` if we should recurse into the struct via `parseMapping`
930 private enum mightBeOptional (alias FR) = is(FR.Type == struct) &&
931     !is(immutable(FR.Type) == immutable(core.time.Duration)) &&
932     !hasConverter!(FR.Ref) && !hasFromString!(FR.Type) &&
933     !hasStringCtor!(FR.Type) && !hasFromYAML!(FR.Type);
934 
935 /// Convenience template to check for the presence of converter(s)
936 private enum hasConverter (alias Field) = hasUDA!(Field, Converter);
937 
938 /// Provided a field reference `FR` which is known to have at least one converter,
939 /// perform basic checks and return the value after applying the converter.
940 private auto viaConverter (alias FR) (Node node, string path, in Context context)
941 {
942     enum Converters = getUDAs!(FR.Ref, Converter);
943     static assert (Converters.length,
944                    "Internal error: `viaConverter` called on field `" ~
945                    FR.FieldName ~ "` with no converter");
946 
947     static assert(Converters.length == 1,
948                   "Field `" ~ FR.FieldName ~ "` cannot have more than one `Converter`");
949 
950     scope impl = new ConfigParserImpl!(FR.Type)(node, path, context);
951     return Converters[0].converter(impl);
952 }
953 
954 private final class ConfigParserImpl (T) : ConfigParser!T
955 {
956     private Node node_;
957     private string path_;
958     private const(Context) context_;
959 
960     /// Ctor
961     public this (Node n, string p, const Context c) scope @safe pure nothrow @nogc
962     {
963         this.node_ = n;
964         this.path_ = p;
965         this.context_ = c;
966     }
967 
968     public final override inout(Node) node () inout @safe pure nothrow @nogc
969     {
970         return this.node_;
971     }
972 
973     public final override string path () const @safe pure nothrow @nogc
974     {
975         return this.path_;
976     }
977 
978     protected final override const(Context) context () const @safe pure nothrow @nogc
979     {
980         return this.context_;
981     }
982 }
983 
984 /// Helper predicate
985 private template NameIs (string searching)
986 {
987     enum bool Pred (alias FR) = (searching == FR.Name);
988 }
989 
990 /// Returns whether or not the field has a `enabled` / `disabled` field,
991 /// and its value. If it does not, returns `true`.
992 private EnabledState isMappingEnabled (M) (Node node, auto ref M default_)
993 {
994     import std.meta : Filter;
995 
996     alias EMT = Filter!(NameIs!("enabled").Pred, FieldRefTuple!M);
997     alias DMT = Filter!(NameIs!("disabled").Pred, FieldRefTuple!M);
998 
999     static if (EMT.length)
1000     {
1001         static assert (DMT.length == 0,
1002                        "`enabled` field `" ~ EMT[0].FieldName ~
1003                        "` conflicts with `disabled` field `" ~ DMT[0].FieldName ~ "`");
1004 
1005         if (auto ptr = "enabled" in node)
1006             return EnabledState(EnabledState.Field.Enabled, (*ptr).as!bool);
1007         return EnabledState(EnabledState.Field.Enabled, __traits(getMember, default_, EMT[0].FieldName));
1008     }
1009     else static if (DMT.length)
1010     {
1011         if (auto ptr = "disabled" in node)
1012             return EnabledState(EnabledState.Field.Disabled, (*ptr).as!bool);
1013         return EnabledState(EnabledState.Field.Disabled, __traits(getMember, default_, DMT[0].FieldName));
1014     }
1015     else
1016     {
1017         return EnabledState(EnabledState.Field.None);
1018     }
1019 }
1020 
1021 /// Return value of `isMappingEnabled`
1022 private struct EnabledState
1023 {
1024     /// Used to determine which field controls a mapping enabled state
1025     private enum Field
1026     {
1027         /// No such field, the mapping is considered enabled
1028         None,
1029         /// The field is named 'enabled'
1030         Enabled,
1031         /// The field is named 'disabled'
1032         Disabled,
1033     }
1034 
1035     /// Check if the mapping is considered enabled
1036     public bool opCast () const scope @safe pure @nogc nothrow
1037     {
1038         return this.field == Field.None ||
1039             (this.field == Field.Enabled && this.fieldValue) ||
1040             (this.field == Field.Disabled && !this.fieldValue);
1041     }
1042 
1043     /// Type of field found
1044     private Field field;
1045 
1046     /// Value of the field, interpretation depends on `field`
1047     private bool fieldValue;
1048 }
1049 
1050 /// Evaluates to `true` if `T` is a `struct` with a default ctor
1051 private enum hasFieldwiseCtor (T) = (is(T == struct) && is(typeof(() => T(T.init.tupleof))));
1052 
1053 /// Evaluates to `true` if `T` has a static method that is designed to work with this library
1054 private enum hasFromYAML (T) = is(typeof(T.fromYAML(ConfigParser!(T).init)) : T);
1055 
1056 /// Evaluates to `true` if `T` has a static method that accepts a `string` and returns a `T`
1057 private enum hasFromString (T) = is(typeof(T.fromString(string.init)) : T);
1058 
1059 /// Evaluates to `true` if `T` is a `struct` which accepts a single string as argument
1060 private enum hasStringCtor (T) = (is(T == struct) && is(typeof(T.__ctor)) &&
1061                                   Parameters!(T.__ctor).length == 1 &&
1062                                   is(typeof(() => T(string.init))));
1063 
1064 unittest
1065 {
1066     static struct Simple
1067     {
1068         int value;
1069         string otherValue;
1070     }
1071 
1072     static assert( hasFieldwiseCtor!Simple);
1073     static assert(!hasStringCtor!Simple);
1074 
1075     static struct PubKey
1076     {
1077         ubyte[] data;
1078 
1079         this (string hex) @safe pure nothrow @nogc{}
1080     }
1081 
1082     static assert(!hasFieldwiseCtor!PubKey);
1083     static assert( hasStringCtor!PubKey);
1084 
1085     static assert(!hasFieldwiseCtor!string);
1086     static assert(!hasFieldwiseCtor!int);
1087     static assert(!hasStringCtor!string);
1088     static assert(!hasStringCtor!int);
1089 }
1090 
1091 /// Convenience function to extend a YAML path
1092 private string addPath (string opath, string newPart)
1093 in(newPart.length)
1094 do {
1095     return opath.length ? format("%s.%s", opath, newPart) : newPart;
1096 }