Parameter coercion

Shamelessly stealing and converting @theScottyJam's idea from my previous post into a top-level topic:

I know it's common for a function that expects a string to take whatever parameter it receives and coerce it into a string, as a first step, like this:

function concat(x_, y_) {
  const x = String(x_)
  const y = String(y_)
  return x + y
}

What if, at any location where you're binding a value to a variable, you were allowed to pass the incoming value through a normalization function first, using, say, a "from" keyword (~I don't like this "from" word - feel free to bikeshed it~ edit: I like from!). That would make the above example equivalent to this:

function concat(x from String, y from String) {
  return x + y
}

It would also allow you to automatically coerce an unknown value to an error.

try {
  // ...
} catch (e from Error) {
  // ...
}

// ... is the same as ...

try {
  // ...
} catch (e_) {
  let e = Error(e_)
  // ...
}

Some more usage examples:

const normalizedDegrees = deg => deg % 360

function toRadians(deg from normalizedDegrees) {
  // ...
}

const positiveNumberAssertion = value => {
  if (value <= 0) throw new Error('Whoops!')
  return value
}

function doOperation(x from positiveNumberAssertion) {
  // ...
}
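For comparison, the toRadians example desugars into today's JavaScript like this. The body (the degrees-to-radians conversion) is my assumption, since the original elides it:

```javascript
const normalizedDegrees = deg => deg % 360

// Desugared form of `function toRadians(deg from normalizedDegrees)`:
// the transformer runs first, and the body only ever sees the result.
function toRadians(deg_) {
  const deg = normalizedDegrees(deg_)
  return deg * (Math.PI / 180)
}

console.log(toRadians(450) === toRadians(90)) // true
```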

Use existing type validation libraries:

import { z } from 'zod'

const User = z.object({
  name: z.string(),
  email: z.string().email(),

const greetUser = (user from User.parse) => {
  console.log(`Hello ${}, your email is ${}`)

Catch only a certain class of errors:

const rethrowIfNot = type => err => {
  if (err instanceof type) {
    return err
  }
  throw err
}

try {
  // ...
} catch (err from rethrowIfNot(TypeError)) {
  console.log('type error!')
}
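The same filtered-catch pattern already works today if written out by hand, which also shows exactly what the sugar would save; a runnable sketch:

```javascript
const rethrowIfNot = type => err => {
  if (err instanceof type) {
    return err
  }
  throw err
}

let result
try {
  null.foo // reading a property of null throws a TypeError
} catch (err_) {
  // Any non-TypeError would be rethrown here instead of handled.
  const err = rethrowIfNot(TypeError)(err_)
  result = `type error! (${err.message})`
}
console.log(result)
```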

With IDEs/tools that provide type-hints, the parameter could have the inferred type of the return value of the "coercer" function, without having to introduce a _temp variable with an unknown type.


I've warmed up to the "from" name and like it too now ;).

Something I've seen myself do in code bases is logic like this:

function queryEndpointHandler({ query, params: params_, ...moreParamsLikeThese }) {
  const params = JSON.parse(params_)
  // More transformation and assertions for individual parameters ...
}

It's a little annoying to have to pollute the scope with an extra temporary binding (or mutate an existing binding, which I find to be worse) when all I want to do is immediately apply a transformation before using the value everywhere in my function. This "from" keyword would help keep the scope a little cleaner and reduce how many different variables exist in the current scope.

function queryEndpointHandler({ query, params from JSON.parse, ...etc }) {
  // ...
}

On another note, it may be useful to chain multiple "from"s together, for example:

function f({ users from JSON.parse from assertString })

The outermost (right-most) "from" would be executed first.

This indirectly creates a backwards form of the pipeline operator:

const adults
  from users => users.filter(u => u.age > 18)
  from conf => conf.users
  from JSON.parse
  = rawData

Umm, but don't actually use it that way :rofl:

function f(x) {
  x = transform(x);
}

With the way this code is currently written, there is a debugging advantage: there is a clear place to put a breakpoint to see what the value is before it is transformed, and you can then step into the transformer.

Merging these with syntax may reduce the number of lines of code, but maybe it makes code harder to follow in practice.


This sounds like an argument against the overall idea of using different constructs/patterns, such as a fluent API or the pipeline operator, in order to reduce the number of variables you're declaring.

Compare the number of easy-to-use debug points in the following:

const users = getUsers()
const user = users.find(x => === someName)
const groups = user.groups

// vs

const groups = getUsers()
  .find(x => === someName)
  .groups

Most people would rather use the second example, even though the first example is easier to debug.

People who want additional debug points can always do this:

// Do this first line if you would prefer to have something in which you can set a breakpoint on.
function f(data from x => JSON.parse(x))
// If we choose to re-evaluate the `JSON.parse` expression each time the function call happens
// (probably a good idea), then you should be able to set a debug point on just that location too.
function f(data from JSON.parse)


If I'm right, this proposal aims to provide a keyword that converts the type of an argument.

Why not use to instead of from? It makes a bit more sense in my opinion.

function concatenate(x to String, y to String) {
  return x + y;
}


Parameter coercion is really just one use case for "from". Really, it's useful for any kind of transformation or side-effect on the incoming parameter values.

Now, consider the following example where I transform the incoming value to something completely different (you may consider this bad programming practice, but it's still a good illustration of what "from" really does).

function f({ user: groups from user => user.groups }) { ... }
function f({ user: groups to user => user.groups }) { ... }

The value passed in is a "user" object, yet the transformer is providing the function body with a binding to "groups" instead.

Note that the second line is now lying to us. If you read it, it's saying "pass groups to the user-to-group transformer" (groups to user => user.groups). That's not what's going on - we're passing a user object to this transformer and receiving a groups object as an output. The first line is what tells the truth, "receive groups from the user-to-groups transformer" (groups from user => user.groups).


Oops, I had it wrong. Now, if I'm right, the from keyword passes the argument on the left to the function on the right and assigns the argument to the return value of that function.

In my example, the String constructor. Interesting :)

That's a valid way to think about it. Technically, the identifier on the left doesn't get bound until the "from" transformers execute, so it's a little more correct to think of it as "the incoming value passes through the transformers on the right, and the result gets bound to the identifier on the left"

The difference between those explanations is what makes this invalid:

function f(x from y => x + y) // Error! Can't access "x" in the "from" transformer, it's currently undefined.

Even though this is valid:

function f(x, y from z => z + x) // Works

Not a fan of putting all this logic into formal parameters.

Such functions are a pain to use when you already have an object (more generally, an argument in the target form). Instead, I'd prefer both the raw function logic and the coercing wrapper be exposed.

That being said...

Assuming this is just a showcase example — otherwise it could be written as function f({ user: { groups }}) — I think it'd be better to have the srcName, transFn, dstName in this order in the code. Like:

function f({ user |> (u => u.groups): groups }) { ... }
function g({ user |> validateUser: { groups }}) { ... }

That works with object destructuring, which has the power to rename properties, but what about regular old parameters, or array destructuring? (Unless you're suggesting we allow renaming to happen in those too?) Here's another possible ordering:

function f(assertString to JSON.parse to users) { ... }
function f({ users: assertString to JSON.parse to myUsers }) { ... }
function f({ assertString to JSON.parse to users }) { ... }

(I would prefer "to" over "|>" despite the similarities, simply because the right-most value of this chain is going to be a plain identifier, not a function. I think this is too semantically different from pipelines to make that kind of link between them)

First, the incoming value goes through "assertString", then "JSON.parse", and finally gets bound to the identifier "users".

I still like "from" better though - I think it reads better to put the clutter of the transformers after the identifier, instead of trying to read through the transformers first to see what you're even dealing with. But, both work.


Really love this idea. I have a question, it might be a dumb one, but:

What happens if the "transformer" function fails in some way and throws an error?
And where do we catch it? Would a try-catch belong in the outer scope or in the function's scope (in the case of parameters)?

@akaizn-junior, it wouldn't be that different from when an error gets thrown in this scenario:

const requiredParam = () => { throw new Error('That param is required!') }

function fn(x = requiredParam()) {
  // ...
}

fn() // Error: That param is required!

When an error is thrown in the parameter logic, the only realistic place to catch it is where you call the function. If you need to catch it within the function definition and handle it there, then it's better to move the logic into the function body, and away from the parameter list.
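A runnable version of that scenario, showing the error surfacing at the call site:

```javascript
const requiredParam = () => { throw new Error('That param is required!') }

function fn(x = requiredParam()) {
  return x
}

// The default initializer throws before the body ever runs,
// so the only place to catch it is around the call itself.
try {
} catch (e) {
  console.log(e.message) // "That param is required!"
}

console.log(fn(5)) // 5
```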


Hmm, there is an alternative way to achieve this idea.

Pattern-matching will soon let us run arbitrary logic at the location where an assignment is happening. So, with some gross hackery, this sort of thing could be possible:

const getUser = (...args) => match (args) {
  where ([${parseAsJSON} with ${assertIsString} with name]): do {
    // ...
  }
}

This is, however, extremely ugly. But, it's not unfeasible to have a follow-on proposal that carries the concept of with outside of pattern matching and into any location where we can do a normal binding. Any object that supports the matcher protocol could be used with this syntax.

const getUser = (${parseAsJSON} with ${assertIsString} with name) => {
  // ...
}

The matcher protocol returns two things: 1. did the pattern match, and 2. what value should be given as a result. If the protocol says that the match failed, then we can let an error be thrown.

Edit: I updated the examples to include the interpolation syntax that I forgot about (${}). This syntax is a bit weird when we're out of pattern-matching, because we're not really escaping out of "match syntax", so perhaps this isn't as easy to carry out as I thought.


Hi @theScottyJam,

I still find from a bit odd and guess it would clash slightly with the ES6 import ... from syntax. So instead, how about ... via.

The Oxford Dictionary defines it as "by way of; through", which is kind of what we're doing.

function concatenate(x via String, y via String) {
  return x + y;
}

Even though it is less syntactically problematic, it is an uncommon word and may not click right off the bat. Is there any proposal it may clash with, or is it hard to grasp? Let me know what you think.

Yeah, via could work as well. from... via... they seem about the same to me, so I'm good with either one.


This thread is a little interesting. If

function foo(x via somefunc) {
  // ...
}

behaves as described, does that also mean you can do this

let s = 33;
s via (x => x+9);
console.log(s); // 42

and expect it to work? It seems like l via r is supposed to be the same as l = r(l);. Did I miss something?

Not quite - the via/from "operator" is only applicable at an l-value position, similar to destructuring. At least, that's how it's been formulated thus far, certainly there's room for discussion in adding a form like that if we think it would be useful. But, the way you would need to write that in this current formulation is like this:

let s = 33;
s via (x => x+9) = s;
console.log(s); // 42

I'd argue that it makes no sense to do that. If this keyword is only supposed to be good as an l-value, then it's best to constrain it to function definitions. Otherwise:

//with sugar
s via (x => x+9) = s;  //Ugly and confusing
//sugar free
s = (x => x+9)(s) = s; //Even worse

I would argue that it can also be useful in destructuring outside of function parameters.

const {
  age: userAge via assertIsNumber,
  name: fullName via asString,
  dateOfBirth via stringToDate
} = user;

Though, admittedly, the usefulness of that is not strong, since it's not too hard to just write that like this:

const userAge = assertIsNumber(user.age);
const fullName = asString(;
const dateOfBirth = stringToDate(user.dateOfBirth);

And, who knows, maybe this syntax shouldn't even be allowed to be combined with destructuring, so we don't need to deal with the various edge cases associated with that. In which case, perhaps it does make sense to limit it to just function parameters.


It's already complicated enough that when destructuring, the meaning of the object definition is inverted (declared as value:key instead of key:value), but combining that with this new construct just adds to the potential confusion. Normally x via y would resolve to (x = y(x)). However, when combined with destructuring, {value: key via fn} = obj resolves to key = fn(obj.value) which is unintuitive given the application of via in a function definition.
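For concreteness, here is how that destructuring form would desugar into today's JavaScript (`obj` and `fn` are placeholder names of my own):

```javascript
const obj = { value: 21 }
const fn = x => x * 2

// Hypothetical syntax:  const { value: key via fn } = obj
// Today's equivalent — the property is looked up by its source
// name, transformed, and bound under the new name:
const key = fn(obj.value)

console.log(key) // 42
```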
