Did you know that model filtering is implemented by METHODS that are more or less unique to every modeling application? But the filtering PRODUCT is always (OK, with caveats) the same: a LIST OF GUIDS.
The proposed TGN OPEN CODE accounts for these method differences. Each app's unique feature methods, for TGN core features 1 through 8, are reconciled through a code transformation layer down into a standardization layer (the TGN OPEN CODE) to deliver a package that's recognizable in any other app that implements TGN OPEN CODE. So TGN rigs are portable from app to app, with graphics fidelity intact, at least to the minimum standard defined by the TGN OPEN CODE:
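As a minimal sketch (the mini-model and every name here are invented for illustration, not any app's real API), here's how two apps could filter by entirely different methods while producing the identical portable product, a list of GUIDs:

```python
import uuid

# Hypothetical mini-model: each element has a stable GUID plus metadata.
# Real apps filter by very different methods (saved view filters, query
# languages, rule engines); the portable PRODUCT is just the GUID list.
model = [
    {"guid": str(uuid.uuid5(uuid.NAMESPACE_URL, f"elem-{i}")),
     "category": cat, "level": lvl}
    for i, (cat, lvl) in enumerate([
        ("Wall", "L2"), ("Door", "L2"), ("Wall", "L1"), ("Duct", "L2"),
    ])
]

def filter_method_a(elements):
    """One app's 'method': an imperative loop over element properties."""
    out = []
    for e in elements:
        if e["category"] == "Wall" and e["level"] == "L2":
            out.append(e["guid"])
    return out

def filter_method_b(elements):
    """Another app's 'method': a declarative comprehension."""
    return [e["guid"] for e in elements
            if (e["category"], e["level"]) == ("Wall", "L2")]

# Different methods, identical portable product: a list of GUIDs.
assert filter_method_a(model) == filter_method_b(model)
```

The GUID list is the app-neutral artifact; everything upstream of it can stay proprietary.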

Look. TGN is CODE you already have in your modeling app.
It’s just not packaged together there for coherent function, for visual engagement. TGN is about making that function easy, accessible, and clarifying. It’s also about making it portable, so once you have it, you can share it into other modeling apps and platforms.
I’m asking software companies and independent developers to join a project to do this development together open source for the benefit of all modeling apps and their users. TGN will improve how people engage with models, making it easier for more people to make better use of them.
It’s also a new base for further innovation in this direction for anyone who wants to add/extend/differentiate.
More on this development proposal coming up. Stay tuned. And, don’t wait either. Contact me if you’re interested.

Demo of these 8 TGN features (mockup), specification, and detailed discussion articles are on the Tangerine website here and with more on the blog:
https://tangerinefocus.com/blog-tangerine/
Let’s strengthen human engagement with models by building better equipment for doing that.
Let’s create an industry-wide development team to build this for use in all modeling apps/platforms. Interested software companies and independent developers are invited. Contact me if you want to help build this.
The idea is to start with an early adopter developer community and then introduce that community and its codebase to one of the existing open software foundations…
AEC attention (focus) technique:
| # | ATTENTION TECHNIQUE | LINKS |
|---|---|---|
| 1 | technical drawing (+ fusion in mental model) | tradition |
| 2 | technical drawing (+ fusion in digital model) | digital fusion |
| 3 | BCF in digital model | format standard |
| 4 | cinematic camera rigging | camera in film (history) |
| 5 | TGN in digital model | TanGeriNe format proposed |
TGN is Tangerine’s proposed open source software development project. TGN is a fusion of the 4 predecessor techniques listed above it in the table.
The fusion transforms its constituent components as it combines them. Something new is made of the parts as each of the parts is made new in fusion. Here’s a good overview of the proposal. The page includes links to download the developer specification, mockup demo videos, and links to several articles discussing the TGN proposal in detail:
The TGN OPEN CODE attention technique is a package that expresses:
1/ in a coordinate system in a model
2/ within a scope/bounding box (or more complex volume)
3/ looking at the designated target face(s) of the scope from the “normal direction”
4/ which is one direction among many relevant directions of view (so, with UI for controlled view variation) – cinematic camera rigging built-in to the attention focusing rigs within model
5/ with the model/twin filtered by relevant filtering criteria
6/ with some style of display applied
7/ with some clarifying remarks or graphics added (authored within the TGN rigs or displayed in the rigs via external link)
8/ and with this feature package shareable with adequate fidelity to other modeling apps
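To make the package concrete, here's a hypothetical sketch of the 8 features gathered into one plain data structure. The field names are illustrative only, not a published TGN schema:

```python
from dataclasses import dataclass, asdict
from typing import List, Optional

# Hypothetical sketch of the 8-feature TGN OPEN CODE package as a plain
# data structure. Field names and types are invented for illustration.
@dataclass
class TgnRig:
    coordinate_system: str          # 1: coordinate system in the model
    scope_box: List[float]          # 2: bounding volume (min/max xyz)
    target_faces: List[str]         # 3: designated target face ids
    view_direction: List[float]     # 4: normal view + controlled variation
    filter_guids: List[str]         # 5: the filter product, a GUID list
    display_style: str              # 6: style of display applied
    extra_graphics_svg: Optional[str] = None  # 7: remarks/graphics as SVG

    def to_open_code(self) -> dict:
        """8: serialize to an app-neutral dict, ready for JSON exchange."""
        return asdict(self)

rig = TgnRig(
    coordinate_system="project-base-point",
    scope_box=[0.0, 0.0, 0.0, 10.0, 5.0, 3.0],
    target_faces=["face-top"],
    view_direction=[0.0, 0.0, -1.0],
    filter_guids=["guid-1", "guid-2"],
    display_style="hidden-line",
)
package = rig.to_open_code()
```

Feature 8, shareability, falls out of serializing the whole package rather than being a field of its own.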
TGN (itself) is transforming thought into action
That’s what TGN is: the transformation of modeled and real reality through engagement (as articulated attention), which induces thought that transforms the whole stream into action/work/result.
TGN is itself transforming, from just me thinking/writing/demonstrating/talking about it, to software companies collaborating to make it real in software you already use.
This is getting started.
There will be many opportunities to contribute to this and shape it.
Contact me if you want to jump in!
- Digital tools are an extension of human perceptual/cognitive equipment for engagement with models (real world or digital) for interpretive and generative purposes.
- Digital tools need to further advance in support of this perceptual/cognitive equipment.
- This need is evident whether models are generated by natural or artificial intelligence (NI or AI).
- The equipment provided to date in modeling software is inadequate/underdeveloped. This inadequacy is the single greatest reason that technical drawing still supplies the majority of revenue for major commercial modeling software developers, even in 2023, after decades of modeling.
- TGN is an open source software development proposal intended to raise the adequacy of the relevant equipment within model-handling software (all kinds).
- Commercial and independent software developers are invited to join a project to make this happen. Contact me if interested.
- TGN is a minimum feature set that:
  - a) will make a difference, and
  - b) corresponds to the way human perception works in modeled worlds (real or digital).
- But TGN can be extended and added to, to the extent anyone can envision. The extensions and additions need not be open source.
- “Control Rods” for human-in-the-loop input back into AI and other computational, iterative model-generating systems are optimally hosted within TGN rigs, within models. These controls are for human guidance: the laying down of control parameters, control markers, and control drivers within models, as feedback into the model-generating systems. You understand the idea?
You can see why, right? You can imagine the development of a tremendous variety of such controls, and the fact that these would be made very easily accessible, visible, close at hand, and intelligible, within the context of these rigs.
Let your imagination loose. What control rods would you embed there to guide generative (AI or NI generated) development of the model?
What’s after that for TGN?
TGN will evolve far beyond its proposed open source core feature set. I have so many ideas for the broader attention focusing rigs (AFR) concept. No doubt many will have many such ideas.
Once TGN exists in software, people will expand it. I certainly don’t believe all the additions, extensions, and enhancements to the core TGN features have to be open source. They can be domain and app specific and proprietary.
But everyone doing that will benefit from the open standardized core feature set TGN. It gives a great foundation to build from.
Here’s a comment from an engineer on LinkedIn:
Wow, that seems interesting. How does it work with existing models?
TGN would work great in existing models and in work-in-progress models. Anyone could add new TGN rigs to any model.
In my TGN developer spec (download links here) there is a note also on upgrading existing BIM views to TGN.
There is also the optional use of drawings linkable to TGN rigs. The implementation details depend on what each modeling app dev team decides to implement.
But there is a proposed common core feature set (listed above, and throughout the Tangerine website).
TGN is proposed for use in all modeling apps and platforms.
All developers are invited to do this.
The goal now is:
1. Find more developers and users who want TGN in their apps.
2. Form an open source developer community.
3. Build an open source codebase by moving developers from thinking about TGN to actually building it.
4. Seek a home for the community and codebase at one of the relevant open source software foundations.
5. Help others get TGN into their apps, and let them join the dev community and contribute to the codebase.
6. Help those who want to extend beyond the core TGN features, building on top of the core (and around it).
More details on the TGN features are here:
And something to think about here:
To clarify just a bit more, for developers and users:
Look again at this diagram:

The thing I want to draw attention to is the BLUE layer at the top: the code people already have in their apps for cameras, extra graphics, scope boxes, linking files, etc.
But what they don’t have is:
- a package of those features aggregated together for coherent function (the function is: “Hey, look here at this,” and then you articulate something very clearly).
- an industry-standard neutral description of each of those features. That’s the ORANGE layer, the TGN OPEN CODE. The open source project will consist of the community agreeing on neutral definitions for these features (the pioneers will have a big head start on defining that). Then each developer can take the TGN OPEN CODE and build their own transformation code (the middle layer of the sandwich) to transform UP or DOWN between the orange layer and the blue layer.
This is how the portability will work.
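A minimal sketch of that sandwich, with invented names throughout (these are not the actual TGN OPEN CODE definitions): each app implements a transformer that maps its native blue-layer rig objects DOWN to a neutral package and back UP:

```python
# Hypothetical sketch of the per-app transformation layer. Each app team
# implements down() and up() against its own native rig representation;
# the dict in the middle stands in for the neutral TGN OPEN CODE package.
class TgnTransformer:
    """Transformation layer between a native app and the TGN OPEN CODE."""

    def down(self, native_rig) -> dict:
        """Native (blue layer) rig -> neutral TGN OPEN CODE package."""
        raise NotImplementedError

    def up(self, open_code: dict):
        """Neutral TGN OPEN CODE package -> native (blue layer) rig."""
        raise NotImplementedError

class DemoAppTransformer(TgnTransformer):
    # Imaginary app that stores rigs natively as (style, guids) tuples.
    def down(self, native_rig):
        style, guids = native_rig
        return {"display_style": style, "filter_guids": list(guids)}

    def up(self, open_code):
        return (open_code["display_style"], tuple(open_code["filter_guids"]))

t = DemoAppTransformer()
package = t.down(("hidden-line", ("guid-1", "guid-2")))   # sink to orange
restored = t.up(package)                                   # lift to blue
```

The round trip through the neutral package is what makes a rig authored in one app land intact in another.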
BUT EQUALLY IMPORTANT!:
There is great opportunity for each developer, as far as they can imagine, to do unique proprietary things in their blue layer for each feature, plus add extra features (features 9, 10, 11… 30…) around the TGN OPEN CORE set of 8 features.
The OPEN CORE is just the lowest-common-denominator minimum core feature set, for portability and as a standardized base to rely on, build on top of, and build around.
One last thing about that.
Take feature 7, extra graphics, for example. The transformation layer’s job (the middle of the sandwich, between blue and orange) is to sink the blue graphics down to the TGN OPEN CODE. The blue graphics are whatever your app lets you make: direct authoring of text, dimensions, and vectors, or a link to external graphics such as CAD files or iPad illustration apps. For feature 7, sinking means transforming all extra graphics down to SVG format, because SVG is an open standard that anyone can implement to display those graphics in their app.
But in the BLUE top of the sandwich, everybody can do whatever they want. So, for feature 7 extra graphics, check out this inspiration for extra-graphics authoring that some developers might choose to develop, unique to their apps. With a transformation layer sinking those graphics down to the TGN OPEN CODE, these fancy-in-your-app extra graphics would still look right, with adequate graphics fidelity, when your rigs are shared to other apps, at least from the NORMAL direction of each rig, with the special extra graphics flattened to SVG at that primary position within the rig.
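Here's a hypothetical sketch of sinking one native text annotation down to feature 7's common denominator. The SVG element and attribute names are the real SVG standard; the function and its inputs are invented for illustration:

```python
# Hypothetical flattening step in the transformation layer: one native
# text annotation becomes a minimal standalone SVG document (feature 7's
# lowest common denominator). Real implementations would also flatten
# dimensions, vectors, and linked graphics.
def text_note_to_svg(text: str, x: int, y: int, font_size: int = 12) -> str:
    """Flatten a native text annotation to a minimal standalone SVG."""
    return (
        '<svg xmlns="http://www.w3.org/2000/svg" width="200" height="50">'
        f'<text x="{x}" y="{y}" font-size="{font_size}">{text}</text>'
        "</svg>"
    )

svg = text_note_to_svg("Check this connection", x=10, y=20)
```

Any app that can render standard SVG can then display the note at the rig's primary (normal-direction) position, however exotic the authoring tools were on the blue side.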
I’m just saying this:
Everyone can develop pure graphics power as much as they can imagine, like the link below, but also plan for how they want to transform their unique implementations down to the lowest common denominator industry standard TGN OPEN CODE:
Here’s pure power for extra graphics, feature 7. Pete Townshend gets it: