News

Posted over 14 years ago by Jason Dentler
First, NHibernate 3.0 Cookbook is now a Packt Publishing best seller. Thank you to everyone who bought a copy. The NHibernate project gets a portion of each and every sale. Yesterday, Fabio announced the release of NHibernate 3.0 General Availability. Go get it!

The previous official release of NHibernate was version 2.1.2, just over a year ago. Since then, the team has made a ton of improvements and bug fixes. Most importantly, NHibernate now targets .NET 3.5, allowing us to use lambda expressions and LINQ. This has led to an explosion of new ways to configure and query; a short LINQ sketch follows at the end of this post. There are a few very minor breaking changes mentioned in the release notes:

[NH-2392] ICompositeUserType.NullSafeSet method signature has changed
[NH-2199] Null values in maps/dictionaries are no longer silently ignored/deleted
[NH-1894] SybaseAnywhereDialect has been removed and replaced with SybaseASA9Dialect; the Sybase Adaptive Server Enterprise (ASE) dialects have been removed
[NH-2251] Signature change for GetLimitString in Dialect
[NH-2284] Obsolete members removed
[NH-2358] The DateTimeOffset type now works as a DateTimeOffset instead of a "surrogate" of DateTime

Plans for version 3.1 include additional bug fixes and patches, as well as enhancements for the new LINQ provider. As Fabio says, Happy Persisting!
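As a quick illustration of the new LINQ provider, here is a minimal sketch. The Customer entity and the surrounding setup are hypothetical and only for illustration; the Query<T>() extension method comes from the NHibernate.Linq namespace shipped with NHibernate 3.0.

using System.Collections.Generic;
using System.Linq;
using NHibernate;
using NHibernate.Linq; // brings the Query<T>() extension method into scope

// Hypothetical entity used only for this example.
public class Customer
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
    public virtual bool IsActive { get; set; }
}

public static class LinqQueryExample
{
    // Returns the names of active customers, ordered alphabetically,
    // using the LINQ provider introduced in NHibernate 3.0.
    public static IList<string> ActiveCustomerNames(ISession session)
    {
        return session.Query<Customer>()
                      .Where(c => c.IsActive)
                      .OrderBy(c => c.Name)
                      .Select(c => c.Name)
                      .ToList();
    }
}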
Posted over 14 years ago by Patrick Smacchia
Patrick Smacchia writing. I am not an NH developer but the creator of a static analysis tool for .NET developers: NDepend. I recently analyzed NH v3.0.0 Candidate Release 1 with NDepend and had a chance to discuss some results with NH developer Fabio Maulo. Fabio suggested I show some results on the NH blog, so here it is. NDepend generated a report by analyzing the NH v3.0.0 CR1 code base. See the report here. NDepend can also show static analysis results live, inside Visual Studio, and the live results are richer than the static report. Here I will mostly focus on results extracted from the report, but a few additional results come from the richer NDepend live capabilities.

Code Size

The NH code base weighs almost 63K Lines of Code (LoC as defined here). Developers hate LoC as a productivity yardstick, but that doesn't mean the LoC metric is useless: it is a great way to compare code base sizes and to get an idea of the overall development effort. In the report's namespace metrics section, we can see that the NHibernate.Hql.Ast.ANTLR.* namespaces generated by ANTLR weigh around 18K LoC, so we can consider that the handcrafted NH code weighs about 45K LoC. Now we have a number to compare to the 19K LoC of NUnit, the 28K LoC of CC.NET, the 32K LoC of Db4o, the 110K LoC of NDepend, the roughly 130K LoC of Llblgen, the roughly 500K LoC of R# (which certainly contains a significant portion of generated code) and the roughly 2M LoC of the .NET Fx 4. So not only is NH one of the most successful OSS initiatives, it is also one of the biggest OSS code bases. To quote one NH contributor, NH is a big beast!

Assembly Partitioning

NH is packaged in a single NHibernate.dll assembly. I am a big advocate of reducing the number of assemblies, and one assembly seems an ideal number. This way:

Projects that consume NH need to reference and maintain just one assembly. This is a very good thing compared to many other OSS frameworks that force you to reference and maintain many assemblies.
Compilation time is much (much) faster. Compiling one single VS project can easily be 10 times faster than compiling the same code base partitioned across many VS projects.
Start-up time of an application using NH is a bit faster, since the CLR adds a slight overhead for each extra assembly loaded at runtime.

On the dependency graph and dependency matrix diagrams of the report, I can see that the NH assembly links three extra assemblies that need to be redistributed as well: Antlr3.Runtime, Remotion.Data.Linq, and Iesi.Collections.

Code Coverage and NH Code Correctness

The report shows a 75.93% code coverage ratio. This is an excellent score, especially taking the large code size into account. I consider the code coverage ratio the queen of code quality metrics: the higher it is, the less likely it is that a bug is released into production. However, things are not so simple. A high code coverage ratio matters if (and only if) the number of checks performed while running the unit tests is also high. These checks are usually done in test code (through APIs like Assert.IsTrue(...)). But few developers realize that checks have the same value when they are done in the tested code itself, through Debug.Assert(...) or through the new Microsoft Code Contracts API. The two important things are that checks (or contracts, if you prefer) must not slow down execution and must fail abruptly when the condition is violated.
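To make the point about in-code checks concrete, here is a minimal sketch of such a Debug.Assert(...) guard. The class and its invariant are hypothetical and not taken from the NH code base.

using System.Diagnostics;

// Hypothetical example of an in-code check: Debug.Assert is compiled out of
// release builds (so it does not slow down execution) but fails abruptly in
// debug runs, including runs driven by the unit tests, when the invariant breaks.
public class OrderLine
{
    private readonly int _quantity;

    public OrderLine(int quantity)
    {
        Debug.Assert(quantity > 0, "quantity must be strictly positive");
        _quantity = quantity;
    }

    public int Quantity
    {
        get { return _quantity; }
    }
}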
I can quickly see that NH uses neither Debug.Assert(...) nor the new Microsoft Code Contracts API. On the other hand, I can see that NH comes with 2735 unit tests, all executed successfully. This significant number, coupled with the 75.93% code coverage ratio, argues for an excellent testing plan for NH. To quote one NH contributor I talked with once: NH is very hard to break! (Although by using code contracts and striving for an even higher code coverage ratio, it would be even harder to break.) Another obvious reason why the NH code is rock solid is the huge size of the NH community, which can be counted in hundreds of thousands of developers and projects. In these conditions, any bug has very little chance of living for a long time.

Code Architecture

Most .NET developers consider (wrongly, IMHO) that .NET code must be componentized through .NET assemblies (meaning through VS projects). As discussed above, having very few assemblies comes with important benefits. The essential point is that assemblies are physical artifacts while components are logical artifacts; hence assembly partitioning must be driven by physical reasons (like lazy code loading or an add-in system). Nevertheless, a 63K LoC code base needs a solid architecture, and a solid architecture is the key to high code maintainability. How should components be defined in .NET code? Personally, my preference goes to using namespaces to define components. This approach comes with many advantages: namespaces are logical artifacts, they can be structured hierarchically, architecture explorer tooling can deal with them out of the box, they are supported at the language level, and they can be used to draw explicit and concrete boundaries. In a framework such as NH, namespaces are essentially used to organize the public API. That is not incompatible with componentizing the code through namespaces, but in the case of NH, the project inherited the API structure of the Hibernate project from the Java sphere. The original Hibernate project doesn't rely on code componentization through namespaces, so NH doesn't either. And there is no hope for any refactoring: it would result in a fatal tsunami of breaking changes in the NH public API. So the NH code base has no obvious (at least to me) or explicit componentization. I know there are architecture guidelines that NH contributors must learn, understand and follow, but sitting outside of the project, I cannot easily figure them out.

Code Quality

If you look back at the report, you'll see many typical code quality rules violated. As said, I consider the code coverage ratio the queen of code quality rules, but that doesn't mean that other code quality metrics don't matter. Through the rule Methods too complex - critical (ILCyclomaticComplexity) I can see two dozen awfully complex methods. Most of them seem to be generated by ANTLR, so there is room here to refine the NDepend code query rule to exclude this generated code, for example:

// <Name>Methods too complex - critical (ILCyclomaticComplexity)</Name>
WARN IF Count > 0 IN SELECT METHODS
OUT OF NAMESPACES "NHibernate.Hql.Ast.ANTLR"
WHERE ILCyclomaticComplexity > 40
  AND ILNestingDepth > 4
ORDER BY ILCyclomaticComplexity DESC

...and see that now only 3 handcrafted methods are matched (one of those, NHibernate.Cfg.Configuration.GenerateSchemaUpdateScript(Dialect, DatabaseMetadata), has 49 lines of code, a cyclomatic complexity of 25 and is 87% covered by tests).
The violated rule Methods with too many parameters - critical (NbParameters) is more of a concern, since we can see a recurring code smell of constructors with plenty of parameters (up to 22 parameters for the constructor of the class NHibernate.Cfg.Mappings). The violated rule Type should not have too many responsibilities (Efferent Coupling) seems to me another concern. It exhibits several god classes, meaning classes with too many responsibilities. Here NDepend bases its measure on the Efferent Coupling metric, which represents the number of other types a type is using. The notion of class responsibility is a bit abstract; it is often translated into the tenet "a class should have only one reason to change", which is still abstract in my opinion. Obviously, the higher the Efferent Coupling, the more likely it is that a class has too many responsibilities. God classes often result from a lack of refactoring as the project evolves, iteration after iteration. The god class represented an initially clear concept that has evolved without appropriate refactoring, and developers got used to living with the code smell. In the context of a public framework such as NH, refactoring a public god class or interface might not be an option if it implies unacceptable public API breaking changes.

The violated rule Complex methods should be 100% covered by tests exhibits a few hundred relatively complex methods not thoroughly covered by tests. Here, too, a lot of these methods belong to NHibernate.Hql.Ast.ANTLR, and after filtering them out we still have more than 200 matches. This is a concern because having a high code coverage ratio is not enough: what is important is to have a lot of methods and classes that are 100% covered by tests. Indeed, empirically I have noticed that code that is hard to test is often code that contains subtle and hard-to-find bugs. Unfortunately, the 10% of the code that is hard to test is the code that demands more than 50% of the test-writing resources.

We could continue to enumerate the violated code quality rules one by one. The truth is that any sufficiently large code base contains thousands of violations of the most basic code quality rules. An important decision must be taken to care for code quality before the code becomes so messy that it discourages developers from working on it (and to be honest, I had feedback from two NH contributors who left the project, partly for that reason). Once again, the NH situation here is more the rule than the exception, and I'd say that if you are a real-world developer yourself, there are nine chances in ten that you are not satisfied with the code quality of the everyday code base you are working on. The problem when deciding to start caring for code quality is that tooling like NDepend or FxCop reports literally thousands of flaws. However, a tool like NDepend makes things easier through its support for a baseline. Concretely, one can decide to continuously compare the code base against, say, the last release, and then fix flaws only in code refactored or added since the baseline. This way the team follows the rule "if it's not broken, don't fix it" and achieves better and better code quality without significant effort. Concretely, a CQL rule that takes the baseline into account can be written as easily as:

// <Name>From now, all methods added or refactored should not be too complex</Name>
WARN IF Count > 0 IN SELECT METHODS WHERE
// Match methods new or modified since Baseline for Comparison...
(WasAdded OR CodeWasChanged) AND
// ...that are too complex
CyclomaticComplexity > 10

Code Evolution

And this is a good transition to the last part I'd like to comment on: Code Diff. As said, NDepend can compare two versions of a code base, and in the report we compared NH v3.0.0.CR1 with v2.1.2.GA. The rule API Breaking Changes: Types exhibits a few matches:

// <Name>API Breaking Changes: Types</Name>
WARN IF Count > 0 IN SELECT TYPES
WHERE IsPublic AND (VisibilityWasChanged OR WasRemoved)

Types like NHibernate.Dialect.SybaseAnywhereDialect, NHibernate.Cache.ISoftLock or NHibernate.Cfg.ConfigurationSchema.ClassCacheUsage were public types that have either been removed, renamed, or made internal. We can also see that some public interfaces, such as NHibernate.Proxy.IProxyFactory or NHibernate.Hql.IQueryTranslator, have been changed. This can break client code if these interfaces were meant to be implemented by clients. In the Code diff section of the report, the queries Public Types added and Namespaces added are a means to list the new features added in NH v3.

// <Name>Public Types added</Name>
SELECT TYPES WHERE WasAdded AND IsPublic

Here we mostly see the prominent new NH v3 LINQ capabilities through the numerous NHibernate.Linq.* namespaces added, and we can also spot the many secondary featurettes, like NHibernate.SqlTypes.XmlSqlType or NHibernate.Transaction.AdoNetWithDistributedTransactionFactory.
Posted over 14 years ago by Jason Dentler
(From my personal blog @ http://jasondentler.com) I had hoped to include a CQRS-related recipe in the Data Access Layer chapter of my book. Of course, not having any real-world CQRS experience myself, I couldn't offer any authoritative guidance. Now that I have some free time, I'm determined to remedy that situation. I won't go into the specifics of CQRS or even event sourcing; the internet already has plenty of people explaining it better than I ever could. If you're like me, you need code to learn. You need to hack away at something for a few days before you really get it. In the spirit of "learning in the open," I'm sharing this weekend's effort to fix up Greg Young's Simple CQRS example. His solution is called "SimplestPossibleThing.sln", which describes it perfectly. It's a great learning tool, but it's all built on top of in-memory collections, not persistent storage. In this post, I'm going to make his event store persistent. With some luck, I'll move on to the read model and bring it full circle in a later post.

Before we dive in, take a look at Greg's in-memory implementation. There are a few things to note:

Rather than persisting the actual events, he's "persisting" EventDescriptor structs with references to the Events. I'm going to steal this idea to make our NHibernate code easier.
The expectedVersion parameter should match the version of the most recent event. When it doesn't, we know we have a concurrency violation.

A persistent event store

First, let's do a little refactoring of the EventStore implementation:

public abstract class BaseEventStore : IEventStore
{
    private readonly IEventPublisher _publisher;

    protected BaseEventStore(IEventPublisher publisher)
    {
        _publisher = publisher;
    }

    public void SaveEvents(Guid aggregateId, IEnumerable<Event> events, int expectedVersion)
    {
        var eventDescriptors = new List<EventDescriptor>();
        var i = expectedVersion;
        foreach (var @event in events)
        {
            i++;
            @event.Version = i;
            eventDescriptors.Add(new EventDescriptor(aggregateId, @event, i));
        }
        PersistEventDescriptors(eventDescriptors, aggregateId, expectedVersion);
        foreach (var @event in events)
        {
            _publisher.Publish(@event);
        }
    }

    public List<Event> GetEventsForAggregate(Guid aggregateId)
    {
        var eventDescriptors = LoadEventDescriptorsForAggregate(aggregateId);
        if (null == eventDescriptors || !eventDescriptors.Any())
        {
            throw new AggregateNotFoundException();
        }
        return eventDescriptors.Select(desc => desc.EventData).ToList();
    }

    protected abstract IEnumerable<EventDescriptor> LoadEventDescriptorsForAggregate(Guid aggregateId);

    protected abstract void PersistEventDescriptors(
        IEnumerable<EventDescriptor> newEventDescriptors,
        Guid aggregateId,
        int expectedVersion);
}

Concurrency violation checking

Greg's implementation explicitly checked for concurrency violations before persisting. Since he's working in memory, it's a simple check and a cheap operation. With a database, it gets more complicated. We could lock and query for the max version, but that's extreme and unnecessary. We assume that the expectedVersion value is not greater than the actual current version. Since we're not deleting events from the event stream, I think this is a safe assumption. Essentially, while there's a small chance someone may have done something to our aggregate, they'll never undo something from our aggregate. We can rely on our database for the check. If we insert an event with version 2 after events 0, 1, 2, and 3 are written, we'll get a primary key constraint violation.
Since this is the only PK in our database, we know exactly why this happened. We'll convert this to a ConcurrencyException.

Persistence implementation

Now we have a base class that handles the transformation and event publishing and lets us implement our own persistence.

public class NHibernateEventStore : BaseEventStore
{
    private readonly IStatelessSession _session;

    public NHibernateEventStore(
        IEventPublisher publisher,
        IStatelessSession session)
        : base(publisher)
    {
        _session = session;
    }

    protected override IEnumerable<EventDescriptor> LoadEventDescriptorsForAggregate(Guid aggregateId)
    {
        var query = _session.GetNamedQuery("LoadEventDescriptors")
            .SetGuid("aggregateId", aggregateId);
        return Transact(() => query.List<EventDescriptor>());
    }

    protected override void PersistEventDescriptors(
        IEnumerable<EventDescriptor> newEventDescriptors,
        Guid aggregateId,
        int expectedVersion)
    {
        // Don't bother to check expectedVersion. Since we can't delete
        // events, we won't skip a version. If we do have a true concurrency
        // violation, we'll get a PK violation exception.
        // SqlExceptionConverter will change it to a ConcurrencyViolation.
        Transact(() =>
        {
            foreach (var ed in newEventDescriptors)
                _session.Insert(ed);
        });
    }

    protected virtual TResult Transact<TResult>(Func<TResult> func)
    {
        if (!_session.Transaction.IsActive)
        {
            // Wrap in transaction
            TResult result;
            using (var tx = _session.BeginTransaction())
            {
                result = func.Invoke();
                tx.Commit();
            }
            return result;
        }
        // Don't wrap
        return func.Invoke();
    }

    protected virtual void Transact(Action action)
    {
        Transact<bool>(() =>
        {
            action.Invoke();
            return false;
        });
    }
}

We're using stateless sessions because it's easy. We don't need a big unit of work implementation, automatic dirty checking, lazy loading, or any of that other stuff we rely on for traditional applications. We're just stuffing rows into a table. For those of you who've read my book, the Transact methods are taken right from the first section of my Data Access Layer chapter. They let us manage the NHibernate transaction when we need to, and handle it for us when we don't.

Query and Model

The LoadEventDescriptors query is dead simple:

<?xml version="1.0" encoding="utf-8" ?>
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2">
  <query name="LoadEventDescriptors">
    <![CDATA[
    from EventDescriptor ed
    where ed.Id = :aggregateId
    order by ed.Version
    ]]>
  </query>
</hibernate-mapping>

Next, we redesign the EventDescriptor for use with NHibernate.

public class EventDescriptor
{
    public Event EventData { get; private set; }
    public Guid Id { get; private set; }
    public int Version { get; private set; }

    public EventDescriptor(Guid id, Event eventData, int version)
    {
        EventData = eventData;
        Version = version;
        Id = id;
    }

    private EventDescriptor() { }

    public override bool Equals(object obj)
    {
        return Equals(obj as EventDescriptor);
    }

    public bool Equals(EventDescriptor other)
    {
        return null == other ? false : other.Id == Id && other.Version == Version;
    }

    public override int GetHashCode()
    {
        return Id.GetHashCode() ^ Version.GetHashCode();
    }
}

We've switched from a struct to a class, converted the readonly fields to properties with private setters, added a private constructor, and implemented Equals and GetHashCode. We did all of this to make NHibernate happy. We won't be doing any lazy loading, so we don't need to make our properties virtual. Because we'll use a composite key (Id and Version), we need to override Equals and GetHashCode.
Here's our mapping for EventDescriptor:

<?xml version="1.0" encoding="utf-8" ?>
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2"
                   assembly="SimpleCQRS"
                   namespace="SimpleCQRS.EventStore">
  <typedef class="SimpleCQRS.EventStore.NHibernate.JsonType, SimpleCQRS.EventStore.NHibernate" name="json" />
  <class name="EventDescriptor" table="Events" mutable="false" lazy="false">
    <composite-id>
      <key-property name="Id" />
      <key-property name="Version" />
    </composite-id>
    <property name="EventData" type="json">
      <column name="Type" />
      <column name="Data" />
    </property>
  </class>
</hibernate-mapping>

EventDescriptor is immutable. We've disabled lazy loading. Our primary key is a composite of the Id and Version. Our EventData is stored in two columns: the first stores the assembly qualified name of the .NET type, and the second stores the event as JSON-serialized data. We use the JsonType IUserType to handle the serialization and deserialization transparently. Newtonsoft Json.NET does all of the heavy lifting.

[Serializable]
public class JsonType : IUserType
{
    private static object Deserialize(string data, string type)
    {
        return Deserialize(data, TypeNameHelper.GetType(type));
    }

    private static object Deserialize(string data, Type type)
    {
        return JsonConvert.DeserializeObject(data, type);
    }

    private static string Serialize(object value)
    {
        return null == value ? null : JsonConvert.SerializeObject(value);
    }

    private static string GetType(object value)
    {
        return null == value ? null : TypeNameHelper.GetSimpleTypeName(value);
    }

    public object NullSafeGet(IDataReader rs, string[] names, object owner)
    {
        int typeIndex = rs.GetOrdinal(names[0]);
        int dataIndex = rs.GetOrdinal(names[1]);
        if (rs.IsDBNull(typeIndex) || rs.IsDBNull(dataIndex))
        {
            return null;
        }
        var type = (string) rs.GetValue(typeIndex);
        var data = (string) rs.GetValue(dataIndex);
        return Deserialize(data, type);
    }

    public void NullSafeSet(IDbCommand cmd, object value, int index)
    {
        if (value == null)
        {
            NHibernateUtil.String.NullSafeSet(cmd, null, index);
            NHibernateUtil.String.NullSafeSet(cmd, null, index + 1);
            return;
        }
        var type = GetType(value);
        var data = Serialize(value);
        NHibernateUtil.String.NullSafeSet(cmd, type, index);
        NHibernateUtil.String.NullSafeSet(cmd, data, index + 1);
    }

    public object DeepCopy(object value)
    {
        return value == null ? null : Deserialize(Serialize(value), GetType(value));
    }

    public object Replace(object original, object target, object owner)
    {
        return original;
    }

    public object Assemble(object cached, object owner)
    {
        var parts = cached as string[];
        return parts == null ? null : Deserialize(parts[1], parts[0]);
    }

    public object Disassemble(object value)
    {
        return (value == null) ? null : new string[] { GetType(value), Serialize(value) };
    }

    public SqlType[] SqlTypes
    {
        get
        {
            return new[]
            {
                SqlTypeFactory.GetString(10000),     // Type
                SqlTypeFactory.GetStringClob(10000)  // Data
            };
        }
    }

    public Type ReturnedType
    {
        get { return typeof(Event); }
    }

    public bool IsMutable
    {
        get { return false; }
    }

    public new bool Equals(object x, object y)
    {
        if (ReferenceEquals(x, y))
        {
            return true;
        }
        if (ReferenceEquals(null, x) || ReferenceEquals(null, y))
        {
            return false;
        }
        return x.Equals(y);
    }

    public int GetHashCode(object x)
    {
        return (x == null) ? 0 : x.GetHashCode();
    }
}

TypeNameHelper still needs some work. GetSimpleTypeName should strip out the version, public key, processor architecture, and all that goo from the assembly qualified name.

public static class TypeNameHelper
{
    public static string GetSimpleTypeName(object obj)
    {
        return null == obj ? null : obj.GetType().AssemblyQualifiedName;
    }

    public static Type GetType(string simpleTypeName)
    {
        return Type.GetType(simpleTypeName);
    }
}

Finally, we need a bit of NHibernate magic to convert primary key constraint violations into ConcurrencyExceptions. I probably could have made this simpler, but it works.

public class SqlExceptionConverter : ISQLExceptionConverter
{
    public Exception Convert(AdoExceptionContextInfo exInfo)
    {
        var dbException = ADOExceptionHelper.ExtractDbException(exInfo.SqlException);
        var ns = dbException.GetType().Namespace ?? string.Empty;

        if (ns.ToLowerInvariant().StartsWith("system.data.sqlite"))
        {
            // SQLite exception
            switch (dbException.ErrorCode)
            {
                case -2147467259: // Abort due to constraint violation
                    throw new ConcurrencyException();
            }
        }

        if (ns.ToLowerInvariant().StartsWith("system.data.sqlclient"))
        {
            // MS SQL Server
            switch (dbException.ErrorCode)
            {
                case -2146232060:
                    throw new ConcurrencyException();
            }
        }

        return SQLStateConverter.HandledNonSpecificException(exInfo.SqlException, exInfo.Message, exInfo.Sql);
    }
}

Fabio has a blog post all about NHibernate's SQLExceptionConverter. To turn this on, just set the sql_exception_converter property in your NHibernate configuration (a small sketch follows at the end of this post). While I was working on this, I ran into NH-2020, despite it being marked resolved. Basically, batching and the SQL exception converter don't mix, so turn off batching. I told Fabio about it, and I'll do what I can to get it fixed for good in NH 3 GA. Thanks to Greg Young for all his efforts to teach the world CQRS through CQRSInfo.com, including his 6 1/2 hour screencast. Also, thank you Fabio for sharing your JSON user type with me and answering my questions.
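As a footnote, here is a minimal sketch (not part of the original post) of how the sql_exception_converter setting mentioned above might be wired up in code, together with the batching workaround for NH-2020. It assumes the standard NHibernate.Cfg.Environment property keys and an existing hibernate.cfg.xml; adapt to your own bootstrapping.

using NHibernate.Cfg;

public static class EventStoreConfiguration
{
    public static Configuration Build()
    {
        // Reads hibernate.cfg.xml (connection, dialect, mappings, etc.)
        var cfg = new Configuration().Configure();

        // Register the converter that turns PK violations into ConcurrencyExceptions.
        cfg.SetProperty(Environment.SqlExceptionConverter,
                        typeof(SqlExceptionConverter).AssemblyQualifiedName);

        // NH-2020: batching and the SQL exception converter don't mix, so disable batching.
        cfg.SetProperty(Environment.BatchSize, "0");

        return cfg;
    }
}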
Posted over 14 years ago by Paulo Roberto Quicoli
Hi friends! Next October 26th I'll be introducing NHibernate to Delphi users at the Delphi Conference. It's a free event; you just have to register for it. So, if you will be here in Brazil, pay us a visit. Event website: http://latam.embarcadero.com/br/delphiconference/ See you there!
Posted over 14 years ago by Fabio Maulo
This week Packt published the NHibernate 3.0 Cookbook. Packt is offering all members of the NHibernate community 20% off the book. The book gives users solutions to common NHibernate problems so they can develop high-quality, performance-critical data access applications. The book, which is 328 pages long, contains quick-paced, self-explanatory recipes organized by skill level and functional area.

Overview of NHibernate 3.0 Cookbook

Master the full range of NHibernate features
Reduce hours of application development time and get better application architecture and performance
Create, maintain, and update your database structure automatically with the help of NHibernate
Written and tested for NHibernate 3.0, with input from the development team distilled into easily accessible concepts and examples
Part of Packt's Cookbook series: each recipe is a carefully organized sequence of instructions to complete the task as efficiently as possible

Is this book for you?

This book is written for NHibernate users at all levels of experience. Examples are written in C# and XML. Some basic knowledge of SQL is assumed. To get your exclusive 20% discount when you buy through PacktPub.com, just enter the discount code NHIBCBK20 (case sensitive) in the shopping cart. Click here to read more about the NHibernate 3.0 Cookbook.

Note: customers must be logged in to PacktPub.com for the discount code to be applied.
Posted over 14 years ago by Jose Romaniello
A month ago I released HqlAddin Green Popotito. After talking with my friend Fabio Maulo, he convinced me to add support for hbm files. Well, that's it: the new release adds support for hbm.xml files, and I've completely dropped the support for "hql" files (sorry, but that is not the standard way of writing named queries). So "Green Popotito" is the last version with support for hql files. In this post I will talk about the features of HqlAddin v0.9 – Alpha 3.

Syntax highlighting

HqlAddin brings coloring to your queries, as you can see here:

blue for keywords
dark-red for strings
blue-violet for parameters

Even inside <![CDATA[ ]]> tags:

Error detection

Using the same parser as NHibernate 3, HqlAddin can detect syntactical problems. You can see the problem in three places:

the red wave underline
the tooltip text
the error list, where you can see the ANTLR exception and navigate to the line with a double click

This is another example of an exception:

Intellisense

In order to get intellisense you need to do a little trick, explained in this link. If you don't do this trick, the other two features explained above will still work. I am not so happy with this code, especially with having to touch your code. I will improve this in future versions, and maybe I will add another mechanism, but for now this is all I have. You can add this piece of code to any project, even your data tests project. A little explanation of the process: in order to get intellisense working, HqlAddin needs access to your NHibernate configuration, so you have to export (with MEF, from System.ComponentModel.Composition) an instance of NHibernate.Cfg.Configuration (a sketch of such an export follows at the end of this post). It doesn't matter what version of NHibernate you are using. HqlAddin will search for your configuration when:

You open a solution.
You build a solution.

The intellisense working:

This combo comes with french fries

Because you have given me an instance of NHibernate's Configuration, I know when it fails. So I call this feature "strongly typed hbm's", because configuring NHibernate actually means "compiling the mappings". This is the snapshot:

This is not a designed feature, just a side effect :) and it's outside the scope of the addin. That's all. As always, I am very interested in your opinion.
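Here is a minimal sketch of what such a MEF export of the Configuration might look like. The class name and the way the Configuration is built are hypothetical; the only requirement described above is that an NHibernate.Cfg.Configuration instance is exported from some project in your solution so HqlAddin can find it in the build output.

using System.ComponentModel.Composition;
using NHibernate.Cfg;

// Hypothetical export class; it can live in any project, even a test project.
public class HqlAddinConfigurationExport
{
    [Export(typeof(Configuration))]
    public Configuration Configuration
    {
        get
        {
            // Build the configuration exactly as the application does,
            // e.g. from hibernate.cfg.xml and the embedded hbm.xml mappings.
            return new Configuration().Configure();
        }
    }
}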
Posted almost 15 years ago by Jose Romaniello
Today I have released a new version of the HqlAddin project: version 1.0.0.4550 – Alpha 2 (a.k.a. Green Popotito). Green Popotito comes with an amazing new feature: intellisense for properties and entities. This is great work that comes from Felice Pollano's NhWorkbench, so kudos to him. Before we continue, let me show you a short screencast of the tool; follow this link.

How does this work?

You have to export (with MEF) your configuration in any assembly. HqlAddin will look in the output path of the startup project, and if it finds an export, it will import the information to give you intellisense. Just write an export in any project; the build path can be anywhere. Follow the instructions here. Any configuration of any version of NHibernate is OK. In order to update your intellisense (e.g. when you change your mappings) you will have to rebuild the solution with Ctrl + Shift + B.

How to use the hql files?

You can read the full guide here.

Where do I get this?

You can download it from the main website or simply from the Visual Studio Online Gallery in your Extension Manager. Read more here.

Providing feedback is mandatory

Please get involved with the project; provide some feedback, thoughts, issues or anything!